And then you realize maybe Ayn Rand had a number of things right after all.
No, she really didn't.
What I mean by Rand getting some things right is that she pushed the idea that a lot of history is moved forward by a relative handful of people. And there is some truth to that. In one of his books, the philosopher Norman Swartz talks about Newton, the first and only person to have had the insight of how bodies attract one another, and wonders whether, if Newton had never lived, anyone else would have hit on it. Maybe, maybe not. Don’t get me wrong, I’m no Rand fan, but when I see a minuscule number of people inventing artificial eyes while so much of the rest of humanity supports or prosecutes wars, votes for Donald Trump, etc., I feel depressed.
And of course, for peacegirl, artificial sight based on AFFERENT mechanisms isn’t sight at all, because reasons.
I figured this is what you were going for? Thanks for clarifying.
Personally, I aim to create a better, different sort of artificial eye. But this is foundational to proving out theories of operation in the first place.
I did a lot of work to place myself in companies that actually put together the theory of function for a sensor technology, in preparation for when I eventually get money for a lab, which at this point is probably going to happen.
I had an interest, so regardless of whether doing so is profitable, scalable, or "wise", I want to build a DIY full-stack vision system that takes in the input of a camera and then, on the other end, outputs frame by frame what the camera "sees".
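At its barest, that pipeline is just a capture loop feeding a decoder. Here's a minimal sketch, assuming OpenCV on the camera side; decode_frame is a hypothetical placeholder for whatever would actually reconstruct the scene:

```python
# Minimal camera-in, reconstruction-out loop (sketch, not a finished system).
# Assumes OpenCV is installed; decode_frame() is a hypothetical stand-in
# for whatever model turns the encoded sensor data back into an image.
import cv2
import numpy as np

def decode_frame(frame: np.ndarray) -> np.ndarray:
    # Placeholder decoder: identity pass-through.
    # A real system would run the sensor data through a learned model here.
    return frame

cap = cv2.VideoCapture(0)          # open the default camera
try:
    while True:
        ok, frame = cap.read()     # grab one frame
        if not ok:
            break
        reconstruction = decode_frame(frame)
        cv2.imshow("reconstruction", reconstruction)
        if cv2.waitKey(1) & 0xFF == ord("q"):   # quit on 'q'
            break
finally:
    cap.release()
    cv2.destroyAllWindows()
```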
I don't mind using synthetic data or canned data, and I don't mind buying ingredients for this machine so long as I know how they are made!
I have had to diligently study this to the level of a systems engineer, mostly on my own, because, to put it bluntly, I wasn't born into the sort of family whose scions normally get to work on that sort of thing.
The reason a few-channel sensor can only produce a vague sensation along a single dimension is that it doesn't carry enough parallel data for more. This is well understood by the people engineering such technologies.
The issue is more... The eye sees at a pretty damn good resolution, and data at that resolution has to make contact with the optic nerve. There's a break-out of that nerve at the eye, but it gets narrower as it passes through the skull, because nerve fibers can cluster more densely than the complicated structure of cones and rods. And if the fibers fire faster than the cones and rods reset, they can multiplex information with sequenced signals, using fewer neurons than there are cones and rods. I expect any such process is learned over time through the interaction of the optic nerve with the larger brain.
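To make the time-sharing idea concrete, here's a toy sketch with made-up numbers (it's an illustration of time-division multiplexing, not a model of the optic nerve): if the output lines cycle faster than the inputs refresh, fewer lines can carry the same data as sequenced bursts.

```python
# Toy time-division multiplexing: N slow input channels share M faster lines
# by taking turns within one refresh period. Numbers are arbitrary.
import numpy as np

N_INPUTS = 12                  # slow, photoreceptor-like channels
N_LINES = 3                    # fewer, faster output lines
SLOTS = N_INPUTS // N_LINES    # time slots needed per refresh period

def multiplex(samples: np.ndarray) -> np.ndarray:
    """Pack N_INPUTS samples into SLOTS sequential bursts on N_LINES lines."""
    return samples.reshape(SLOTS, N_LINES)   # each row is one time slot

def demultiplex(bursts: np.ndarray) -> np.ndarray:
    """Recover the original ordering on the receiving side."""
    return bursts.reshape(-1)

samples = np.arange(N_INPUTS)      # one refresh period's worth of data
bursts = multiplex(samples)        # what travels down the narrow bundle
assert np.array_equal(demultiplex(bursts), samples)
print(bursts)                      # 4 time slots x 3 lines carrying all 12 values
```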
Getting that much data out in parallel requires some creative thinking and specialized connectors. That's why this little digital packet signaling thing can't let someone see a shadow: the sheer lack of resolution and channels of data.
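For a rough sense of that scale gap, here's a back-of-envelope comparison. The biological numbers are commonly cited order-of-magnitude figures, and the electrode count is an assumed example of an early implant-scale array, not a specific device spec:

```python
# Back-of-envelope channel-count comparison (order-of-magnitude estimates only).
PHOTORECEPTORS = 100_000_000    # ~rods + cones in one retina
OPTIC_NERVE_FIBERS = 1_000_000  # ~retinal ganglion cell axons
IMPLANT_ELECTRODES = 60         # assumed early-implant-scale electrode array

print(f"retina -> optic nerve fan-in:      ~{PHOTORECEPTORS // OPTIC_NERVE_FIBERS}:1")
print(f"optic nerve vs. implant channels:  ~{OPTIC_NERVE_FIBERS // IMPLANT_ELECTRODES:,}x more fibers")
```

Even with those loose numbers, the gap between a few dozen electrodes and a million-fiber bundle is the whole story of why the percept is a blur rather than a picture.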