>>7098
>We should try thinking outside the box to speed up learning.
OK, here's my take:
As humans, we are innately wired for language in the very design of our neural tissue, probably even down to our DNA. This is built into us before we are even born. Thereafter, as young children we begin picking up language by parroting our parents/siblings/etc. and gauging the social responses. All of these training examples, stored in our short- and long-term memories, are also shaped by the surrounding environment.
In other words, you learn very young the right things to say and the right times to say them based on purely external factors. Parents instinctively switching to 'baby-talk' and then to progressively more complex speech may be a factor in how quickly we progress.
But the astounding facility of the imaginations we possess, which lets us quickly internalize these lessons and then creatively reassemble them into novel ideas (and thence, effectively, into an increased vocabulary), is probably an important key to understanding what differentiates us from mere animals.
A materialist view that we are nothing but a mere specialization of them will be fruitless IMO; we are simply qualitatively different. This is part of what sets us apart, and it will behoove us in our efforts here to try to unravel what this distinction is and how to capitalize on it.
>perplexity
So just to check that I'm getting the idea: a lower perplexity score indicates the model predicts the data more accurately?
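If I understand it correctly, perplexity is just the exponential of the average per-token cross-entropy, so a lower score would mean the model assigns higher probability to the text it's tested on. A minimal sketch of the arithmetic, assuming that definition (the token probabilities below are made-up numbers, not output from any real model):

import math

# Probabilities the model assigned to each actual next token in a held-out sentence.
# Illustrative values only.
token_probs = [0.25, 0.10, 0.60, 0.05, 0.30]

# Average negative log-likelihood (cross-entropy) per token.
avg_nll = -sum(math.log(p) for p in token_probs) / len(token_probs)

# Perplexity is e raised to that cross-entropy. A perfect model (p = 1.0 on
# every token) would score 1.0; worse predictions push the score upward.
perplexity = math.exp(avg_nll)
print(perplexity)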
>so I started training it on C++ and Python code instead
<[desire to know more intensifies]
>The approach I take to learning Japanese is by learning the most frequent words first because it's the biggest return on investment and I can guess the meaning of most sentences even though I don't know that many words.
Seems sensible on the surface of it to me.
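That frequency-first ordering maps pretty directly onto how we could order training data for a model as well. A rough sketch of the same principle applied to a text corpus; the toy corpus and the cutoff of 1000 words are placeholders, not anyone's actual pipeline:

from collections import Counter

# Toy corpus standing in for whatever real text gets scraped.
corpus = "the cat sat on the mat and the dog sat on the rug".split()

# Count every word, then take the most frequent ones first --
# the same "biggest return on investment" ordering described above.
word_counts = Counter(corpus)
core_vocabulary = [word for word, count in word_counts.most_common(1000)]

print(core_vocabulary[:10])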
>This is the kind of data we want to find.
<again, gib more details plox.
>This is the kind of data we want to find. It gives me an idea to build a 'curative mentor network' that tests a student network to figure out which data the student needs to improve rapidly, rather than only attempting to invent new helpful data like a generative teaching network.
This sounds both interesting and innovative, but I have little idea what that even means, much less how it would be implemented.
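Taking a wild guess at it, though: maybe the 'mentor' runs the student over a pool of candidate examples, measures where the student's loss is worst, and feeds back only those examples for the next round of training. Everything in the sketch below (the names, the toy student, the loss) is my own guess at a minimal shape, not anything from an actual implementation:

def student_loss(student, example):
    # Placeholder loss: a real setup would use e.g. cross-entropy of the
    # student's prediction against the example's label.
    prediction = student(example["input"])
    return (prediction - example["target"]) ** 2

def curate_batch(student, candidate_pool, batch_size):
    # The 'curative mentor' step: score every candidate by how badly the
    # student currently handles it, then keep the worst offenders.
    scored = sorted(candidate_pool, key=lambda ex: student_loss(student, ex), reverse=True)
    return scored[:batch_size]

# Toy student: a single weight trying to fit y = 2*x, starting deliberately wrong.
weight = 0.5
student = lambda x: weight * x

candidate_pool = [{"input": x, "target": 2 * x} for x in range(1, 21)]

for step in range(50):
    batch = curate_batch(student, candidate_pool, batch_size=4)
    for ex in batch:
        # Crude gradient step on the curated batch only.
        error = student(ex["input"]) - ex["target"]
        weight -= 0.001 * error * ex["input"]

print(weight)  # drifts toward 2.0

In that framing the mentor is less about inventing new data and more like hard-example mining / active learning over data that already exists.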
I just hope we can create robowaifus that can interact immersively with us using nothing but their internal, mobile, onboard computation systems.
Just like on my favorite Chinese Cartoons.