Albert Cardona

Plots of a ReLU function (Rectified Linear Unit) and a ReLU tamed by tanh to keep its activation values below 1, which is far more realistic for modeling a neuron's activation function:
https://www.desmos.com/calculator/jcx2xcgd7m

"Desmos Studio is a Public Benefit Corporation with a goal of helping everyone learn math, love math, and grow with math."

#graphs #desmos #plots
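
A minimal sketch of the two curves, assuming the "tamed" variant is the composition tanh(ReLU(x)); the linked Desmos graph isn't reproduced here, so the exact formula is an assumption:

```python
import numpy as np
import matplotlib.pyplot as plt

def relu(x):
    """Rectified Linear Unit: max(0, x), unbounded above."""
    return np.maximum(0.0, x)

def tamed_relu(x):
    """Assumed form of the tamed ReLU: tanh saturates the output,
    so activations stay in [0, 1), bounded like a neuron's firing rate."""
    return np.tanh(relu(x))

x = np.linspace(-3, 3, 400)
plt.plot(x, relu(x), label="ReLU(x) = max(0, x)")
plt.plot(x, tamed_relu(x), label="tanh(ReLU(x)) < 1")
plt.legend()
plt.show()
```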