We have heard a great deal about gender being a spectrum; here Robert Sapolsky talks about the brain being on that same gender spectrum! © Robert Sapolsky 2021
Robert Morris Sapolsky (born April 6, 1957) is an American neuroendocrinology researcher and author. He is currently a professor of biology, and professor of neurology and neurological sciences and, by courtesy, neurosurgery, at Stanford University. In addition, he is a research associate at the National Museums of Kenya.
Here he speaks about the gender aspect of the brains of mammals.
To be fair, behaviorally, we are "our brains". So really this is giving a materialistic observable reality to the fact of "brain sex".
Might I ask whether he has a paper or any written text on this that one could read?
We are our bodies, including our brains. Once we are adults our bodies somewhat show what we have made of ourselves.
Well, you might consider this to be true, but I am not "my body". I recognize I'm not even my whole brain. I'm impacted by those things, sure, but I could end up with ALS and still be getting on well enough. "My body" is just the life support system of "my brain", and most of it is just a bunch of 'external to myself' capabilities that happen to be rubbing up very closely and responsively against the part that IS 'me'.
This is not, to my understanding, a very good description of the brain or the body. The brain is only the hub of a much larger nervous (and to a lesser but important extent cardiovascular) system, non-partible from most of the rest of "the body". Absent a body, the brain has no obvious function nor any ability to function.
Then your understanding is wrong.
Neural groups are not "non-partible". Sometimes their boundaries are quite messy, but any neural group is going to have "layers" and "surfaces".
Granted, this is going to be largely a description of how artificial neural groups are managed, but bear in mind that the management models for HTMs ("hierarchical temporal memories"), arguably the best digital model of our own neurons, are reverse engineered from how neurons are organized in our own brains.
Layers are groups of neurons stacked one right on top of the next, and each layer "surfaces" against its neighbors. But between "stacks" there are additional "surfaces" which then direct output at whole other "stacks".
Various "stacks" have a number of input surfaces, and some number of output surfaces, and on these surfaces you get what are called "sparse data representations" of data.
To one region of the brain, the inputs are "the surface of the optic nerve entering this stack". To the next layer, the initial layer's surface presents BOTH the initial image from the first surface AND an interpretation of what elements exist in it and roughly where. As you move down the hierarchy you get layers of interpretation added, but eventually that data has to go somewhere not just to be "interpreted" but to be ranked on its interpretation, with feedback presented to the interpreter system: it ends up going somewhere.
But not all data, in all interpretations, goes to all places. Stacks near the optic nerve generally don't get much of the "noise" from the auditory stream (short of a powerful psychedelic, or synesthesia), and eventually all that interpreted data gets collated and either resolved into an element of a "will", or into a memory (for later use in the formation of "will").
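To make the "stacks and surfaces" picture concrete, here is a minimal toy sketch in Python. It is not HTM code and not a model of real cortex; the names (ToyLayer, sparsify) and the random wiring are invented purely to illustrate sparse representations flowing up a small hierarchy, with the higher layer seeing both the raw input surface and the lower layer's interpretation.

```python
# Toy illustration only: sparse representations passed between "stacks".
import random

def sparsify(values, active=8):
    """Keep only the indices of the few strongest signals (a sparse representation)."""
    ranked = sorted(range(len(values)), key=lambda i: values[i], reverse=True)
    return set(ranked[:active])

class ToyLayer:
    """One "stack": reads a sparse input surface, emits a sparse output surface."""
    def __init__(self, in_size, out_size, seed):
        rng = random.Random(seed)
        # fixed random "wiring" from input bits to output cells
        self.weights = [[rng.random() for _ in range(in_size)] for _ in range(out_size)]

    def forward(self, sparse_input):
        # each output cell sums the wiring weights of whichever input bits are active
        activity = [sum(w[i] for i in sparse_input) for w in self.weights]
        return sparsify(activity)

# Stand-in for "the surface of the optic nerve entering this stack"
sensor = sparsify([random.random() for _ in range(64)])

layer1 = ToyLayer(in_size=64, out_size=64, seed=1)
layer2 = ToyLayer(in_size=64, out_size=64, seed=2)

surface1 = layer1.forward(sensor)              # first interpretation
surface2 = layer2.forward(sensor | surface1)   # next stack sees raw input AND the interpretation

print("sensor surface:", sorted(sensor))
print("layer 1 output:", sorted(surface1))
print("layer 2 output:", sorted(surface2))
```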
But at some point, my expectation is that there are a number of tightly bound systems that together are "the me I experience". There is clearly a lot of me that is not "the me I experience".
I can chop off legs. I can chop off arms. I expect I could be brain-in-jar. I expect I could be piece-of-brain in jar.
How much of the "external to my seat" neurons, stacks, surfaces, could I lose before I wasn't me anymore? Well, I intend on doing that experiment some day, and intend on doing it to myself.
Sorry rest-of-my-body, them's the breaks.
As it is, there are active things in my own head that are very much notably "not directly me".
I know already that I am surrounded in my own meat by "unreliable interpreters" and "needy fucking children", both of which have learned that being cagey with "me" gets them very little, and both of which have all their work double- and triple-checked, with their common lies and malarkey assumed and weeded out.
But they are there, and they are very apparently opaque, and I still have to deal with that bullshit flowing in on a "constrained surface".
Further, in the engineering of automations and control systems there is always a control hierarchy. At some point there is a central "script" which is responsible for behavior, and usually that system is fairly slow, extremely mutable, and has constrained access to things. And it's not "the whole data processing platform" (the person) or even "the whole environment" (the brain), though much of those things exists to fulfill, support, execute on, or (in our case, sometimes as an automatic function) write "the script": the thing that actually, natively, and in the most formal way describes and creates "that which the system is doing".
All the rest of it, the "how it does it" and the "what it does it to", is, in the vast majority of systems, present and identifiable.
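As a rough illustration of that control hierarchy, here is a tiny Python sketch. The names (Script, Subsystem) and their behaviors are invented for the example; the only point it demonstrates is a slow, rewritable "script" at the top that can only act through the narrow surfaces of faster, fixed subsystems beneath it.

```python
# Toy illustration only: a slow central "script" with constrained access,
# driving faster subsystems it can reach only through narrow interfaces.
import time

class Subsystem:
    """A fast, fixed-function layer; the script reaches it only via execute()."""
    def __init__(self, name):
        self.name = name

    def execute(self, command):
        return f"{self.name} handled '{command}'"

class Script:
    """The slow, rewritable top of the hierarchy; it never touches anything directly."""
    def __init__(self, subsystems):
        self.subsystems = subsystems          # the only surface it can act through
        self.goals = ["look around", "move toward food"]

    def tick(self):
        # one slow cycle: take the next goal (or idle) and delegate it downward
        goal = self.goals.pop(0) if self.goals else "idle"
        return [s.execute(goal) for s in self.subsystems]

agent = Script([Subsystem("motor"), Subsystem("vision")])
for _ in range(3):
    print(agent.tick())
    time.sleep(0.1)  # the script cycles far more slowly than the subsystems it drives
```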
I daresay that in a running system without any kind of debug symbols, no matter what you were looking at, it would be hard to pick that out.
I also daresay that in any evolved system, there is a survival benefit to it being rather difficult to isolate and damage.
The endocrine system is important in a lot of ways. It tips balances which cause neural groups to start producing signals. If those signals are not produced or not noticed (these are in fact "sensory data" of a form...), then the brain, regardless of the activity of the endocrine system, is ignorant of that activity, and it will drive zero behavior.
It is a driver of behavior in a very "fuzzy logic" sort of way.
I expect it's set up that way because, if it weren't, the agent would be able to tweak it, and tweaking around with the "eat" alarm (or any of the more exotic alarms on endocrine activity), or with the goad which inflames the "horny", would make for some rather problematic results in most cases.
It's clear that those inputs on the system can't be left "floating" or "absent", but those are inputs, not the agent itself.
The agent is still most certainly neural, but it's kind of narrow to think that systems that fundamentally show patterns of constraint have unconstrained access to everything.
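A tiny Python sketch of that last point: the endocrine side shows up to the agent only as a graded, read-only signal it can weigh against other drives, not as a knob it can turn directly. HungerDrive and Agent, and the numbers inside them, are invented purely for illustration.

```python
# Toy illustration only: an endocrine-like loop the agent can read but not set.
class HungerDrive:
    """Stands in for an endocrine loop: it changes on its own schedule."""
    def __init__(self):
        self._ghrelin = 0.0               # private state; the agent has no setter for this

    def metabolize(self, hours):
        self._ghrelin = min(1.0, self._ghrelin + 0.2 * hours)

    def signal(self):
        # graded, fuzzy output rather than a crisp on/off flag
        return self._ghrelin

class Agent:
    """The 'neural' part: it can only read the signal and weigh it against other drives."""
    def act(self, hunger_signal, busyness):
        # fuzzy-logic style: behavior follows whichever weighted drive wins out
        return "eat" if hunger_signal * (1.0 - busyness) > 0.4 else "keep working"

drive, agent = HungerDrive(), Agent()
for hour in range(6):
    drive.metabolize(1)
    print(hour + 1, round(drive.signal(), 2), agent.act(drive.signal(), busyness=0.3))
```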