
[MINI] Activation Functions

Data Skeptic · 14:11

In a neural network, the output value of a neuron is almost always transformed by some function. A trivial choice would be a linear transformation, which can only scale the data. Other transformations, however, such as a step function, allow non-linear properties to be introduced.
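
To make that concrete, here is a minimal sketch (not code from the episode; it assumes NumPy) contrasting the two: a linear activation can only rescale its inputs, while a step activation produces an output that no single linear map could.

```python
import numpy as np

def linear(x, scale=1.0):
    # A linear "activation" can only rescale its input; stacking
    # layers of these collapses back to one linear map, since
    # W2 @ (W1 @ x) == (W2 @ W1) @ x.
    return scale * x

def step(x, threshold=0.0):
    # A step (threshold) activation is non-linear: everything at or
    # below the threshold maps to 0, everything above it to 1.
    return np.where(x > threshold, 1.0, 0.0)

z = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])  # example pre-activations
print(linear(z))  # -> [-2.  -0.5  0.   0.5  2. ]
print(step(z))    # -> [0. 0. 0. 1. 1.]
```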

Activation functions can also help to standardize your data between layers. Some functions, such as the sigmoid, have the effect of "focusing" the area of interest in the data: extreme values are squashed close together, while values near the function's point of inflection change quickly with respect to small changes in the input. These functions can also take any real number and map it into a finite range such as [0, 1], which has many advantages for downstream calculation.
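
A small sketch of that squashing behavior (again assuming NumPy, and again not from the episode): extreme inputs that are far apart land almost on top of each other near 0 and 1, while inputs near the inflection point at x = 0 respond much more strongly to the same step size.

```python
import numpy as np

def sigmoid(x):
    # Maps any real number into the open interval (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

# Extreme inputs saturate: -10 and -8 both map very close to 0,
# while 8 and 10 both map very close to 1.
print(sigmoid(np.array([-10.0, -8.0, 8.0, 10.0])))
# -> approximately [4.5e-05, 3.4e-04, 0.99966, 0.99995]

# Near the inflection point at x = 0, the same input step of 0.5
# moves the output far more.
print(sigmoid(np.array([-0.5, 0.0, 0.5])))
# -> approximately [0.3775, 0.5, 0.6225]
```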

In this episode, we give an overview of the concept and discuss a few reasons why you might select one function versus another.
