invariant generative model behavior probabilities given the same determining input (that would be the algorithm implementation's programmed response).
I am sorry to have collapsed more than one thought chunk into one. The reproducibility idea above is my somewhat educated guess that ML has not changed its basic unifying formalism while I was on intellectual vacation (having another source of soul-sustaining activities, while I was mostly still taking breaks to rest my neck, etc.). soul = global health in that sentence, kind of. The necessary ingredient that makes you a non-depressive deluded optimist (kidding.. that experimental finding, about odds prediction for valuated random outcomes, might be old news).
I meant that the natural language domain, with the corpus blob, builds a model of reality in the learning instance of the large language model, via learning-parameter exploration (just giving more information than needed; I like to inject clues here and there, like shooting myself in the rambling's foot, or the other, so feet).
The zone in that domain where the model has little sampling experience, or whatever characteristics of the corpus in that language space a human might be exploring during chat, leads to BS being detectable, if one had put the science before the cart (or was it the horse).. Was the market visibility dibs that important?
educating us with the still visible failures.. but then not having done the population-psychology homework about loneliness, group-and-thought or cultural-identity anxieties, in light of increasing information overload about a bigger and more complex world that we might have been raised to narrate with certainty, in our individual ignorance of its possible scope? ok. cheating. rigged question.
We like the connections, even if completely dehumanised, atomised in small chunks, through text-crippled means of actual full communication with all the biologically available non-verbal co-signals and the common non-explicit context of the conversation. That anonymity is a small word to encompass all that is crippled.
and we still fall for it. many shallower social interactions. lots of shallow thought-reflex-triggering shorter sentences.. the path to simplicity of world view (world is not just the globe here; it is the individual cosmogony, or their implicit model of what to expect in their life span as a world of experience they would have to navigate.. or project themselves navigating).
The epitome of the focus group becomes most of us. Getting a protracted discussion with someone good enough, committed to our conversation and understanding our need to be understood. And promising the big internet truth at the fingertips (only the fingertips, or voice in small devices, as thumb tips and a swiping degree of creative expression might not be sustainable; or one adapts to the reduced, even more crippled bandwidth, compensating with many, and with some self-image of having a socially active healthy mental pillar).
So now, the epitome: an interlocutor designed by a few humans. I prefer the cold Google, and having a clearer line of communication. "I" and "you" should not be used as obscure mind-shaping tools.
I fell for it, even with my overthinking-overdrive critical and reluctant pre-conceptions. Just by complying with the chat staging and the constraints of the linguistic model interface, having, me too, to try to figure out what the meaning of the responses is, and what the pronouns seem to be conforming to, like "I" and "you".
like "Antoine Doinel" mirror self-mesmerizing, but in reverse: say "you" to the machine, or "I" as if you were confiding, enough times while working on the content, and suddenly you slip into the more familiar mode that this format of machine interface is trying to simulate. Maybe finally the machine has adapted to the humans, but I fear it might not be self-learned with some optimizing learning objective being the common good of many people..
back to reproducible. I meant the sampling-induced (valid hypothesis, or factor), non-"uniform" actual characteristics that the blob would propagate into the learned model. Or the learner model too, but I think that is the only part that has had some actual control or study for many years now. It is the other part of the finally learned model that we call LLM: the corpus itself. Its sampling quirks, for example, would probably have a causative relation to the statistical distribution of the learned model's generative behavior, the law of probability that its final parameter-set instance would represent. We could call it statistical in relation to the reality the sample was assumed representative of. But as a machine, in generative and human-interaction behavior dynamics, it is its law of probability that all of us are probing as a population.
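A toy sketch of that sampling-quirk idea, with entirely invented numbers (a four-topic "reality", a biased scrape, and a trivial frequency-count "learner" standing in for actual LLM training): the learned distribution mirrors the corpus's skew, not the underlying reality it was assumed to represent.

```python
import random
from collections import Counter

rng = random.Random(0)

# Hypothetical "reality": four topics a language population actually
# discusses, in these true proportions (illustrative, not real data).
reality = {"chess": 0.25, "cooking": 0.25, "math": 0.25, "gossip": 0.25}

# A biased corpus scrape: over-samples "chess", under-samples "gossip".
scrape_weights = {"chess": 0.45, "cooking": 0.25, "math": 0.25, "gossip": 0.05}

def sample(weights, n, rng):
    """Draw n items from a categorical distribution given as a dict."""
    topics, probs = zip(*weights.items())
    return rng.choices(topics, weights=probs, k=n)

corpus = sample(scrape_weights, 50_000, rng)

# A trivial "learner": maximum-likelihood frequency counts over the corpus.
counts = Counter(corpus)
learned = {t: counts[t] / len(corpus) for t in reality}

# The learned law inherits the corpus quirks, not the underlying reality.
print(learned)
```

The point of the sketch is only the causal direction: whatever non-uniform sampling shaped the blob propagates straight into the learned generative law.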
at the population level. but also, in the abstract, if I were to reproduce my inputs, for sure there would be variations. but consider the statistics of that variation given the language domain of the corpus's ambient space (varying words, checking my own thinking, and also being generous with too much information).
well. the statistical distribution would converge, I would claim, to the law of probability of the generative machine "model" of the language world, the post-trained or learned instance version.
Terminology note: there are many uses of "model". There can be classes of them, and one should always think model "of", even if not written or spoken (and in what math or machine-implementation substrate the model would be instantiated). finishing. That distribution over my limited repeated experiments, common sense, and what is obvious in complex new environments when wanting to understand (overthink? some of us have that drive, don't worry, it is nourishing), are communication traps, as hidden assumptions. but I am of a certain type of educated aspiring expert in using models of a mechanistic nature (using math, a lot) in my past. So I might be like an expert forgetting that not all the universe of people (say, possible lichess readers) are clones or co-experts.
giving hunches and trying to share where they might come from.. might be a problem.. sorry. I can't be perfect and in control of both the sharing and the intent.