Screwy mate screwlisp (npub1fp0…54p5), this is brilliant stuff—symbolic machine learning for the modern age. Why not!!!
The first thought that popped in when I saw your post was Logical Neural Networks (LNNs). As you well know, the DL folks, with their 800-gazillion-parameter somethin' somethin', are bumping up against Mother Nature's stern side eye. In 2020, Riegel et al. (IBM) published the seminal LNN paper, which aims to learn an NN built out of two-input logic gates, with 16 possible activation functions in all (two 0/1 inputs and one 0/1 output give 2^4 = 16 truth tables). They "softened" the hard digital activation function into a differentiable one so as to leverage gradient descent. Once trained, each neuron is "hardened" into a learned two-input logic gate, and there are no "weights" on the connections, obviously. In essence, the resulting NN is just a combinational circuit ready to be burned onto an FPGA. And imagine reducing a massive DL network with 500 billion 64-bit weights into a tiny, all-binary gate network. Better still, the total latency in the online recall phase is just the sum of the gate delays, not tonnes of 64-bit matrix multiplications. Brilliant!
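If it helps to make the soften/train/harden loop concrete, here's a minimal sketch of the mechanism as I described it above: a softmax mixture over the 16 two-input Boolean functions, trained by gradient descent on a toy XOR task, then snapped to the single best gate. This is my own toy illustration, not the exact formulation in the Riegel et al. paper; the product-logic relaxations, the finite-difference gradient, and all the names are just for demonstration.

```python
# Toy sketch of a "soft" two-input logic gate: a softmax mixture over all 16
# Boolean functions, trained by gradient descent, then "hardened" to one gate.
import numpy as np

# Real-valued relaxations of the 16 two-input Boolean functions
# (product logic: 0/1 inputs reproduce the exact truth tables).
GATES = [
    lambda a, b: np.zeros_like(a),        # FALSE
    lambda a, b: a * b,                   # AND
    lambda a, b: a * (1 - b),             # A AND NOT B
    lambda a, b: a,                       # A
    lambda a, b: (1 - a) * b,             # NOT A AND B
    lambda a, b: b,                       # B
    lambda a, b: a + b - 2 * a * b,       # XOR
    lambda a, b: a + b - a * b,           # OR
    lambda a, b: 1 - (a + b - a * b),     # NOR
    lambda a, b: 1 - (a + b - 2 * a * b), # XNOR
    lambda a, b: 1 - b,                   # NOT B
    lambda a, b: 1 - b + a * b,           # B IMPLIES A
    lambda a, b: 1 - a,                   # NOT A
    lambda a, b: 1 - a + a * b,           # A IMPLIES B
    lambda a, b: 1 - a * b,               # NAND
    lambda a, b: np.ones_like(a),         # TRUE
]

def soft_gate(logits, a, b):
    """Softened gate: softmax-weighted mixture of all 16 relaxed gates."""
    w = np.exp(logits - logits.max())
    w /= w.sum()
    return sum(wi * g(a, b) for wi, g in zip(w, GATES))

# Toy training set: learn XOR from its four input/output pairs.
A = np.array([0., 0., 1., 1.])
B = np.array([0., 1., 0., 1.])
Y = np.array([0., 1., 1., 0.])

logits = np.zeros(16)
lr, eps = 1.0, 1e-4
for step in range(500):
    # Finite-difference gradient for brevity; a real implementation backprops.
    base = np.mean((soft_gate(logits, A, B) - Y) ** 2)
    grad = np.zeros(16)
    for i in range(16):
        bumped = logits.copy()
        bumped[i] += eps
        grad[i] = (np.mean((soft_gate(bumped, A, B) - Y) ** 2) - base) / eps
    logits -= lr * grad

# "Harden": keep only the most probable gate; no weights remain, just a gate.
hard = GATES[int(np.argmax(logits))]
print(hard(A, B))   # -> [0. 1. 1. 0.], i.e. the XOR truth table
```

Once hardened, the 16 logits collapse into a single truth table, i.e. pure combinational logic you could drop straight into an FPGA netlist.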
LNN's simple, sparse, two-state symbolic logic machinery might have some hidden connections with your work.
G'luck, mate!
https://arxiv.org/abs/2006.13155
DougMerritt (log😅 = 💧log😄) (npub1vlc…mzqq) [@kentpitman](https://climatejustice.social/@kentpitman) Juan M. Bello-Rivas (npub1aql…46zu) [@ramin_hal9001](https://fe.disroot.org/users/ramin_hal9001) Artyom Bologov (npub1dqw…h4lu) Devine Lu Linvega (npub1z8r…mrq6)