Universal key-word search, unlimited
Isn't it true that literally millions, billions, of voices are effectively excluded from Google / YouTube?
Where is it written that Google/YouTube is obligated to platform hate speech?
No, my point is not about dubious sites being identified as criminal or slanderous and banned by the platform. Those are a very tiny percentage of all sites. I'm referring to literally millions (maybe hundreds of millions, or even billions) of sites by legitimate, non-hate-speech publishers who want to say something to whoever out there is interested. They are not targeted, but rather are EFFECTIVELY excluded from the key-word search because the platform has limits -- I'm not sure what the limits are. They may be technological, or it might be a cost-limit problem.
There are millions of sites out there (billions?) which are not banned per se. Each has a web address / url which could be typed into the address box, and so each is allowed, not nefarious, and you can find it if you happen to know the particular address / url.
But what if that site has something you want to find, and it's there, perfectly legal and legitimate and accessible to anyone who knows the url -- but you do not know the url? Why shouldn't you be able to find this site by typing in the needed search words?
Of course you can find some of them, even an unusual site in many cases, but there are far more sites you can't ever find. Suppose you type in 2 or 3 or 4 key words and get several hundred or thousand hits -- the reality is that there are thousands of others which are never offered to you. Eventually the platform starts repeating previous hits again and again -- hits you've already checked that weren't what you wanted, or weren't good enough. Or the platform finally tells you there are no others. Which is false: there are hundreds or thousands more sites which answer to the key words you typed in. But there is some artificial limit to the number of sites offered, and all the others get excluded. The only way to reach those sites is by typing in the particular url, which you don't know -- because you don't even know the site exists.
I.e., you figure there's probably something out there which answers your search, but you don't know for sure, or don't know the address, and so you search for it by relying on the key words.
For the sake of those searching, and also the site publishers seeking visitors, why shouldn't there be a way the searcher can find that site by continuing on and on, even if it requires many hours, even days, of searching?
The sites are out there, but the platform apparently "runs out" of offers at some point, and so says there's nothing more, or repeats the earlier offers again and again. Apparently the sites which get repeated had to pay a price of some kind to win this special status. Or it's a chance selection process -- maybe the platform takes the first 100 or 500 or 5000 sites which got in, and after that any new ones are excluded.
If the searcher types in 2 or 3 or 4 key words, why can't there be a technology that gives that searcher every single site answering to those words, no matter how many? Even if it's millions, why can't it just keep giving more and more, as long as the searcher wants to keep trying? No repetitions of earlier hits -- just new ones, on and on?
(Actually, there should also be a function for the searcher to go back and find earlier hits, if they remember one, or to repeat the same search over again. But it should likewise be possible to screen out the earlier hits and keep getting only new ones, to search farther and farther.)
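The "no repeats, keep going" search described above can be sketched in a few lines. This is only an illustration under assumptions: `fetch_page` is a hypothetical stand-in for whatever result API a platform would expose (real engines cap how deep you can page), and the point is the client-side logic that remembers earlier hits and yields only new ones.

```python
def fetch_page(query, offset, page_size=3):
    """Hypothetical backend: return a page of result URLs for `query`.
    Past an artificial cap it recycles earlier hits, mimicking the
    repeating behavior the post describes."""
    corpus = [f"https://example.org/{query}/{i}" for i in range(10)]
    if offset >= len(corpus):
        # Past the cap: the engine starts re-offering earlier results.
        return corpus[:page_size]
    return corpus[offset:offset + page_size]

def endless_search(query, page_size=3, max_pages=100):
    """Yield each matching URL exactly once, screening out repeats,
    and stop only when a whole page contains nothing new."""
    seen = set()  # earlier hits, so we can skip them and search farther
    for page_num in range(max_pages):
        page = fetch_page(query, offset=page_num * page_size, page_size=page_size)
        fresh = [url for url in page if url not in seen]
        if not fresh:
            break  # the backend has run dry, or is only recycling old hits
        seen.update(fresh)
        yield from fresh

results = list(endless_search("widgets"))
```

The `seen` set is what the searcher-side "screen out earlier hits" function would amount to; keeping it around (or letting the user browse it) also covers the request to revisit earlier hits or rerun the same search.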
If it's a cost limit, because of the technology, there should be gov't-funded research to create the technology, and even a gov't-funded platform, so anyone can find any site offering what they're searching for.
This . . . would be like me arguing I was cancelled because I got banned from my local bar because of what I think about the Jews.
No, this isn't about a site being cancelled or banned because it's singled out by the platform as objectionable. It's about effective exclusion (de facto exclusion) of sites the platform ignores -- sites which are actually included, but only for the few visitors who happen to know of the site and its web address.