certan on Nostr: The Relu activation function, or any activation function has a simple and single ...
The ReLU activation function, or any activation function, has a single, simple purpose:
Assign different degrees of plausibility to different inputs.
An input more plausible to have some characteristic -> bigger number
And between 0 and 1 there are infinitely many numbers.
I guess that, depending on the sensor, some activation functions are better than others.
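As an illustrative sketch of the idea above (not part of the original note): ReLU passes a "more plausible" input through as a bigger number, while a squashing function like sigmoid maps any input into the interval (0, 1), where there are indeed infinitely many values. The function names here are just the standard textbook definitions.

```python
import math

def relu(x: float) -> float:
    # ReLU: pass positive inputs through unchanged, clamp the rest to 0.
    return max(0.0, x)

def sigmoid(x: float) -> float:
    # Sigmoid: squash any real input into the open interval (0, 1);
    # a more plausible (larger) input lands closer to 1.
    return 1.0 / (1.0 + math.exp(-x))

# A bigger (more plausible) input yields a bigger activation:
print(relu(-2.0), relu(3.0))        # 0.0 3.0
print(sigmoid(-2.0), sigmoid(3.0))  # both strictly between 0 and 1
```

Both functions are monotonically non-decreasing, which is what makes "more plausible input -> bigger number" hold.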
Published at 2023-02-23 18:11:50

Event JSON:
{
"id": "d2a5a2e61d0aa3b959efab7bdb9b03114a63daadfb4bac0d467350400cb6e438",
"pubkey": "ae42883f7a0c60fc7abcf9bcc1f6acd92911879ebe1607436015db21ac4602df",
"created_at": 1677175910,
"kind": 1,
"tags": [],
"content": "The Relu activation function, or any activation function has a simple and single purpose:\n\nAssign different degrees of plausibility to different inputs. \n\nInput more plausibile to have some characteristic -\u003e bigger number \nAnd between 0 and 1 there are an infinity of numbers. \n\nI guess depending on the sensor some activation functions are better than others.",
"sig": "bb49284049a3196a7e6629923419f0d5287be16e5553a4c4ca81e5335ccafcaf297abcf4346dd4a5901cdcebd6f755af3c5a27c562ee8a0dd082934ea0ec09b0"
}