
Artificial intelligence paradigm shift

Yeah. Honestly this is the reason I find the 'collage' and 'database' claims so ridiculous.

DeepSeek runs offline in about 2 gigabytes, while the Internet is many, many terabytes of information.

Even Wikipedia, compressed, is bigger than a DeepSeek model, and Wikipedia is hardly comprehensive.

This kind of 'knowledgebase' function is only possible if the system is leveraging general, rather than 'tabled' understandings.
There is a local version that is about 2 GB, but I think the version that did really well in the benchmarks is a lot bigger than that.
I think they have a few different sizes, but the 2b model is quite serviceable, and the whole point is that it has a quasi-internal monologue that you can read as you please.
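For anyone who wants to see that monologue for themselves, it's only a few lines of Python - just a rough sketch, assuming you have the transformers package and one of the small distilled models downloaded (the model id below is my assumption, double-check it before trusting it):

from transformers import AutoModelForCausalLM, AutoTokenizer

# Rough sketch: load a small local reasoning model and print everything it
# generates, "thinking" section included - it is all just generated text.
# Assumes the transformers/torch packages and this (assumed) model id.
model_name = "deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

prompt = "How many prime numbers are there between 10 and 30?"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))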
 
No, you just didn't actually make a valid argument. Some short years ago, CG appeared in animation, looking like sloppy shit,
Are you really making a comparison between CG and AI?
AI is a literal 5 year old.
And it is still mostly unsubstantiated hype.
Seriously. It's been a few years since they claimed that computer speech recognition is better than human. And yet, when you try to use it, it is clearly not better than human.

If barbos is using English speech recognition to understand a thick Russian accent, then it probably does do poorly. However, the trick to speech recognition is a program that relies on context to guess the most likely words being pronounced, not just to take the acoustic input at face value. People tend to produce speech with a consistent accent, but speech understanding requires the ability to handle a wide variety of phonetic variation in the speech signal, not to mention morphological and syntactic variation. That is, one can understand accents that one cannot accurately reproduce.

So the development of AI, especially including LLMs, has vastly improved the ability of computer programs to transcribe speech. They are pretty good these days at realtime closed captioning of speech in recorded media (for example, on YouTube). Combined with AI-assisted translation, you can even get some reasonably good translations of foreign videos, depending on the language being translated into English.

I've been studying machine translation techniques for a long time, and I am really impressed with the quality of translation I am seeing now. I still see a lot of errors, but the translations are serviceable even when there are a lot of errors.
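To illustrate what I mean by using context rather than taking the acoustics at face value, here is a toy sketch - not how any real recognizer is implemented, and every number in it is invented - of rescoring candidate transcriptions with how plausible each one is in context:

import math

# Toy sketch: combine an acoustic score (what the signal sounded like) with
# a context score (how plausible the phrase is). All numbers are invented.
acoustic = {"recognize speech": 0.40, "wreck a nice beach": 0.45}
context = {"recognize speech": 0.30, "wreck a nice beach": 0.01}

def score(phrase):
    # Log-space so small probabilities combine without underflow.
    return math.log(acoustic[phrase]) + math.log(context[phrase])

best = max(acoustic, key=score)
print(best)  # "recognize speech" wins once context is taken into account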
I have seen Alexa get better at understanding my wife but it's still bad enough that if I'm there she'll ask me to give the command.
 
context to guess the most likely words being pronounced, not just to take the acoustic input at face value
Humans have to do as much, as well. Have you ever tried to follow a conversation in Japanese? They roll out the pronouns and context once at the beginning and if you miss it, forget it, or can't match it up at any point, the whole conversation melts into apparent gibberish for the listener.

"Context heavy" I heard it called once?
This is bad enough between what I'll call top down and bottom up languages (I believe there are terms, but I don't recall them.)

My wife is from a top down language. It's in the refrigerator, on the second shelf, on the right, in back, in a box. But if you told her that if she can get it in the next 60 seconds there is a million dollars in the box in the back right corner of the second shelf of the refrigerator it would be very unlikely she would get it in those 60 seconds. An English speaker would have no problem with the description I gave.

I also had a professor back at the university whose English was so tortured that I could not understand it in realtime--by the time I figured out one sentence I had lost the next one. Fortunately his knowledge was no better than his English; the only hard part of the class was keeping track of where he was wrong.
 
Because AI systems can be, and are being, constantly tuned and developed, and it's absolutely idiotic to look at the acceleration from "can barely string together a sentence" to "can code better than Barbos" in 5 years and conclude it stops there. This implies a velocity, and your view is looking at the current state of things and saying that it will not continue advancing.
But can it really code at all?
Or does it simply create an impression of coding by basically taking code from the internet that seems appropriate as a solution?


It cannot code at any useful level; it's good for searching for useful snippets of code. It does not think in any sense of the word.
It's just a very complicated search engine trained to give the appearance of intelligence.

You bought into this hype where they say AI is better at driving, better at speech recognition, when in reality it's not even close. It's useful as a search engine in fields where a large amount of info needs to be operated on. It can generate better-than-nothing subtitles. It can translate simple news text. But that does not imply thinking even at a 5-year-old level.
Yup, there's no intelligence there, just a repetition of what it's seen before. In programming terms, the ultimate cargo cult programmer.

I have found Visual Studio is getting pretty good at making suggestions to flesh out code, but that's all. It's the same thing that I've seen for nearly 50 years now: the system does more and more of the scutwork, it never touches the high level stuff.
 
Yet again, this is a really shit take.

I doubt either of you have even tried using any of the good local code models, either.

All of computer science is just the logical repetition of not/and/or. Lots of stuff comes from repetitions of seen stuff.

Perhaps everything in terms of knowledge and science comes from the repetition and deconstruction and reconstruction of the modular components of seen things.
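To put that concretely, here's a throwaway sketch - the helper names are just made up for illustration - of how something as "new" as addition falls out of nothing but NOT/AND/OR repeated and recombined:

# Toy sketch: richer behavior built from nothing but NOT, AND, OR on 0/1 bits.
def NOT(a): return 1 - a
def AND(a, b): return a & b
def OR(a, b): return a | b

def XOR(a, b):
    # "a or b, but not both" - only the three primitives, recombined
    return AND(OR(a, b), NOT(AND(a, b)))

def half_adder(a, b):
    # One-bit addition: (sum bit, carry bit)
    return XOR(a, b), AND(a, b)

print(half_adder(1, 1))  # (0, 1), i.e. binary 10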
 
I doubt either of you have even tried using any of the good local code models, either.
I have not used any of them. I've watched a few YouTube videos and tried some LLMs on the web to code.
It was useless. It's incapable of composing even moderately complicated code unless it already exists somewhere on the internet.

It's the same with, for example, computer translation. I remember reading about it. People were trying to make it work for ages.
They were trying to make the machine understand meaning, and it was not working too well. Then they got the idea to forget about meaning and simply make the translation look natural. And this is essentially how it works now. It has a large database of everything and anything in every language. All it needs to do is recognize approximately what is being said, and it already has a snippet in the other language in the most natural phrasing. It works great for news texts. Technical texts, not so much.
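Crudely, the idea works something like this toy sketch - greedy longest-match against a phrase table, then paste in the stored natural-sounding snippet (the table entries here are invented):

# Toy sketch of phrase-table translation: greedy longest-match lookup,
# no understanding of meaning involved. The table entries are invented.
phrase_table = {
    ("good", "morning"): "buenos días",
    ("how", "are", "you"): "cómo estás",
    ("my", "friend"): "amigo mío",
}

def translate(words):
    out, i = [], 0
    while i < len(words):
        # Try the longest phrase first, then shorter ones.
        for length in range(len(words) - i, 0, -1):
            chunk = tuple(words[i:i + length])
            if chunk in phrase_table:
                out.append(phrase_table[chunk])
                i += length
                break
        else:
            out.append(words[i])  # no snippet found: pass the word through
            i += 1
    return " ".join(out)

print(translate("good morning my friend".split()))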
 
I have found Visual Studio is getting pretty good at making suggestions to flesh out code
Yeah, it's remarkable that simple prompting of long function call names was not invented 40 years ago.
It would have saved countless keyboards and a lot of time.
It only tries to give very generic names that you're expected to edit. But it's pretty good at guessing what existing name I'm referring to.
 
All of computer science is just the logical repetition of not/and/or. Lots of stuff comes from repetitions of seen stuff.
Not a "Lots" in case of decent human programmer and everything in case of LLM.
Hard part of human programming is keeping in the head lots of names and interfaces. But good IDE such as VScode can help with that.
It only tries to give very generic names that you're expected to edit. But it's pretty good at guessing what existing name I'm referring to.
What do you mean, generic? It looks at what you are typing and gives you a suggestion list which has all (!!!) the names that can possibly be typed there. It works great with members and local variables. That's basically 90% of the greatness of VScode and such.
It could and should have been done 40 years ago. But I suspect RAM and CPU constraints made it impractical.
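And the mechanism is nothing exotic - roughly a prefix match over whatever names are in scope, something like this made-up sketch:

# Toy sketch of identifier completion: everything in scope that starts
# with what you've typed so far. The identifiers here are invented.
def complete(prefix, names_in_scope):
    return sorted(n for n in names_in_scope if n.startswith(prefix))

scope = ["configReader", "configPath", "connectDb", "count", "customerList"]
print(complete("con", scope))  # ['configPath', 'configReader', 'connectDb']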
 
I have not used any of them
Wait.

Hold up.

>Says LLMs suck

>Has never actually used one

GTFO

You have no bearing on this conversation, no leg to stand on here, nothing of the sort.

You are as qualified to post here on the subject as someone who has never read a book or seen a movie is to review one.
 
shared experiences that are not part of the linguistic text
I find that this takes an overly narrow view of linguistics.

What do you really know about linguistics? Have you ever even taken a course in the subject? I've been in the field for over half a century.


Essentially, each of the components of some experience is itself some phoneme that sums to the overall vector of some communication, and some of it can be baked into intrinsic.

When someone says "I didn't really say..." I can say "no, you did say; these actions and inflection form a statement as clear as any verbal word."

Sometimes the words are difficult to see, and are mediated with other than "sound"? But all things amount to language imputed meaning either indirectly or directly.

Your description of what you think is going on here is barely coherent. You don't need to explain how language works to me. Yes, there are units that speech processing professionals call "phonemes", but they are not the same type of unit that linguists use that term for. And they exist at too low a level to be relevant to textual analysis. They are useful in word recognition tasks, which involves mapping sounds to alphabetic representations of words and morphemes. The phonetic signal is not always present or necessary, if a word can be inferred from other information. People also make hesitations and repetitions that are noise in the acoustic signal, and those need to be filtered out. The key to language understanding is recognition of written words, phrases, and clauses--higher level chunks of text.
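As a side note, the filtering of hesitations and repetitions I mentioned can be as mundane as something like this little sketch (the filler list and example are invented), and it happens long before anything resembling understanding:

# Toy sketch: strip filled pauses and immediate word repetitions from a
# raw transcript before higher-level processing. Filler list is invented.
FILLERS = {"uh", "um", "er"}

def clean(transcript):
    words, out = transcript.lower().split(), []
    for w in words:
        if w in FILLERS:
            continue                 # drop hesitations
        if out and w == out[-1]:
            continue                 # drop immediate repetitions
        out.append(w)
    return " ".join(out)

print(clean("I uh I went to the the store um yesterday"))
# -> "i went to the store yesterday"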
 
My wife is from a top down language. It's in the refrigerator, on the second shelf, on the right, in back, in a box. But if you told her that if she can get it in the next 60 seconds there is a million dollars in the box in the back right corner of the second shelf of the refrigerator it would be very unlikely she would get it in those 60 seconds. An English speaker would have no problem with the description I gave.

I also had a professor back at the university whose English was so tortured that I could not understand it in realtime--by the time I figured out one sentence I had lost the next one. Fortunately his knowledge was no better than his English; the only hard part of the class was keeping track of where he was wrong.

You are absolutely right about differences between English and Chinese speakers when it comes to giving directions. There is a very large literature on this subject of telling people how to get from one location to another. Some cultures prefer to go from landmark to landmark; others purely by orientation.
 
It only tries to give very generic names that you're expected to edit. But it's pretty good at guessing what existing name I'm referring to.
What do you mean, generic? It looks at what you are typing and gives you a suggestion list which has all (!!!) the names that can possibly be typed there. It works great with members and local variables. That's basically 90% of the greatness of VScode and such.
It could and should have been done 40 years ago. But I suspect RAM and CPU constraints made it impractical.
It's had that for a long, long time. I'm talking about more than that. A class had been entirely standalone; due to a change in how part of it was operating, it needed access to the object holding the info it read from the config file. As I worked, it started offering up the correct change simply from my putting my cursor in the right spot, without typing anything. It clearly recognized that I was changing the object's signature and correctly guessed how to implement it. It puts a ghost in the code at the cursor; hit Tab and the ghost becomes real. Do anything else, and it vanishes. This is not simply autocomplete!
 