• Welcome to the new Internet Infidels Discussion Board, formerly Talk Freethought.

What could be the use of Modern Logic?

So, symbolic logic certainly helps but is it really necessary?

Formal logic is very convenient and helpful, but is it ever necessary?

Necessary for what?

That's what I'm asking.

"Is it ever necessary?" You know or you don't. If you know, just give at least one relevant example. No big deal.


You seem to have built up a straw man in your mind over what everyone thinks logic is good for. I don't think that I'm the only commentator here with that impression. Everyone else seems to have trouble grasping your implicit sense of necessity.

I would survive if you just replied as you usually would just using your own notion of necessity.

I'm also open to the idea that I'm wrong as to how much people think formal logic is necessary. No problem. I'm asking a simple question. No need to go ballistic about that.

Logic is a method for calculating truth consistency across a set of propositions. There are important applications for that in mathematics, computer science, philosophy, linguistics, and a host of other fields of study that require precise calculations. These are far from "trivial logical relations".
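As a minimal illustration of "calculating truth consistency across a set of propositions", here is a brute-force truth-table check in Python. This is a toy sketch, not how industrial tools work, and all the names in it are invented:

```python
from itertools import product

def consistent(props, variables):
    """A set of propositions is consistent iff some truth assignment
    makes all of them true; check every assignment by brute force."""
    for values in product([True, False], repeat=len(variables)):
        env = dict(zip(variables, values))
        if all(p(env) for p in props):
            return True
    return False

# Example: {P -> Q, P, not Q} is inconsistent; drop "not Q" and it is satisfiable.
props = [
    lambda e: (not e["P"]) or e["Q"],  # P -> Q
    lambda e: e["P"],                  # P
    lambda e: not e["Q"],              # not Q
]
print(consistent(props, ["P", "Q"]))      # False: no assignment satisfies all three
print(consistent(props[:2], ["P", "Q"]))  # True: P=True, Q=True works
```

Real SAT solvers do essentially this job, just with far cleverer search than enumerating every assignment.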

That's what I would love to see literal examples of. It's one thing to talk about it, it's another to exhibit actual samples. And actual examples are what I asked for in the OP:
So, I'm looking for examples you'd be aware of where modern logic is used for practical applications, and beyond the basics.

Either you think you have them or you don't.
EB
 
Thanks, Phil, I'll try to have a close look into that. If you had a few links you think could help I would appreciate it!
EB
For popular press links, here's an early one describing the context of the Kepler conjecture, and why machine verification was felt necessary. Here's an update from when the project was completed. For SeL4, here's a link briefly explaining it.

Both projects are open-source, as are the formal verifiers behind them, being HOL Light and Isabelle respectively. I can dig out more links or field questions if there are specific things that pique your interest here.
 
The proof of the Kepler Conjecture might be just what I will need.

You wouldn't have anything a bit more complex, would you? :p

Thanks!
EB
 
The proof of the Kepler Conjecture might be just what I will need.

You wouldn't have anything a bit more complex, would you? :p
:) SeL4 is sold as the bigger project, and it's ongoing. I attended a talk by some of the folk behind it and they wanted to introduce the term "proof engineering", to talk about a situation where the level of complexity was so great that you have to start thinking about rigorous processes above and beyond the logic itself. There are more ambitious projects in the pipeline, and it'll only continue.
 
:) SeL4 is sold as the bigger project, and it's ongoing. I attended a talk by some of the folk behind it and they wanted to introduce the term "proof engineering", to talk about a situation where the level of complexity was so great that you have to start thinking about rigorous processes above and beyond the logic itself. There are more ambitious projects in the pipeline, and it'll only continue.

Wooosh, that sounds like exactly the thing! I wish I was a young man! :(

What's the information publicly available on those projects?
EB
 
That stuff was all the rage in the seventies and eighties in SW V&V circles. You might want to read some Cosmides.

Sorry, Dear, somehow I don't seem to speak your language.

Still, have a nice day! :p
EB
 
:) SeL4 is sold as the bigger project, and it's ongoing. I attended a talk by some of the folk behind it and they wanted to introduce the term "proof engineering", to talk about a situation where the level of complexity was so great that you have to start thinking about rigorous processes above and beyond the logic itself. There are more ambitious projects in the pipeline, and it'll only continue.

Wooosh, that sounds like exactly the thing! I wish I was a young man! :(

What's the information publicly available on those projects?
EB
This stuff is all open source, and mostly done in academia. Microsoft Research have a verification team doing big scale stuff, with their most famous success being the verification of the Odd Order theorem, which is a milestone towards the classification of all finite simple groups (right now, way too ambitious a project to seriously consider, I think). But their stuff is also open source (Microsoft Research are cool).

CakeML is a really exciting project, again open source, with some crossover with SeL4. Those projects strike me as the best ways to get this verification stuff seriously under the hood for software projects.

There's likely plenty of stuff being done in the private sphere too. Intel have been doing formal verification ever since their 1994 floating point bug.
 
The proof of the Kepler Conjecture might be just what I will need.

You wouldn't have anything a bit more complex, would you? :p
:) SeL4 is sold as the bigger project, and it's ongoing. I attended a talk by some of the folk behind it and they wanted to introduce the term "proof engineering", to talk about a situation where the level of complexity was so great that you have to start thinking about rigorous processes above and beyond the logic itself. There are more ambitious projects in the pipeline, and it'll only continue.

In engineering we resort to reductionism: a design is reduced to functional modules with well-defined interfaces. In electronic circuit design tools it is called Hierarchical Design. A complex design is done in blocks with signals running between blocks, and one block can have a tree of blocks beneath it. I had designs upwards of 20 D-size drawings. Impossible to keep every detail in your head.

In software, Object Oriented Programming was designed as a reductionist approach to complex software: reduce complex software, with multiple programmers, to functional, testable modules called objects.
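The hierarchical picture described above can be sketched in a few lines of Python (all names invented for illustration): a design is a tree of blocks, each with a defined interface, and the top level composes sub-blocks without holding every detail of what's beneath them.

```python
# Toy sketch of Hierarchical Design: a complex design is a tree of blocks,
# and each block only knows its own immediate children.
class Block:
    def __init__(self, name, children=None):
        self.name = name
        self.children = children or []  # sub-blocks beneath this block

    def count_blocks(self):
        # The whole tree can be queried from the top without the top-level
        # design knowing anything but its direct sub-blocks.
        return 1 + sum(child.count_blocks() for child in self.children)

top = Block("top", [
    Block("power_supply"),
    Block("cpu_board", [Block("alu"), Block("registers")]),
    Block("io_board"),
])
print(top.count_blocks())  # 6
```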
 
Also "system analysis".

Coming from the management and organisation point of view but going down to include technical systems as well, with the same "reductionist", analytical approach.
EB
 
Well, these folk aren't beginners. The SeL4 and CakeML people are capable not only of writing modern microkernels and compilers, but of formally verifying them. They're suggesting that the complexities involved in verification are not familiar complexities, and that a lot more experience is going to be needed in figuring out how to do verification at scale.
 
This stuff is all open source, and mostly done in academia. Microsoft Research have a verification team doing big scale stuff, with their most famous success being the verification of the Odd Order theorem, which is a milestone towards the classification of all finite simple groups (right now, way too ambitious a project to seriously consider, I think). But their stuff is also open source (Microsoft Research are cool).

CakeML is a really exciting project, again open source, with some crossover with SeL4. Those projects strike me as the best ways to get this verification stuff seriously under the hood for software projects.

There's likely plenty of stuff being done in the private sphere too. Intel have been doing formal verification ever since their 1994 floating point bug.

I was familiar with the kind of formal checking done in the railway industry to prove the safety-related software developed for automatic train driving, essentially here in Paris. But that was already quite a long time ago and I no longer have any connection with these guys. And I don't expect the logic of it to have been too overwhelming.

OK, I'll know it exists and where to look, thanks.
EB
 
Well, these folk aren't beginners. The SeL4 and CakeML people are capable not only of writing modern microkernels and compilers, but of formally verifying them. They're suggesting that the complexities involved in verification are not familiar complexities, and that a lot more experience is going to be needed in figuring out how to do verification at scale.

That's really good news! The more, the merrier. And I can wait.
EB
 
This stuff is all open source, and mostly done in academia. Microsoft Research have a verification team doing big scale stuff, with their most famous success being the verification of the Odd Order theorem, which is a milestone towards the classification of all finite simple groups (right now, way too ambitious a project to seriously consider, I think). But their stuff is also open source (Microsoft Research are cool).

CakeML is a really exciting project, again open source, with some crossover with SeL4. Those projects strike me as the best ways to get this verification stuff seriously under the hood for software projects.

There's likely plenty of stuff being done in the private sphere too. Intel have been doing formal verification ever since their 1994 floating point bug.

I was familiar with the kind of formal checking done in the railway industry to prove the safety-related software developed for automatic train driving, essentially here in Paris. But that was already quite a long time ago and I no longer have any connection with these guys. And I don't expect the logic of it to have been too overwhelming.

OK, I'll know it exists and where to look, thanks.
EB
Are those the fully unmanned trains? I recall hearing they were pretty aggressively checked.
 
Validating software and reducing unforeseen side effects became a huge problem as software complexity and size grew circa early 80s.

The OOP approach that arose in the early 80s was to develop software modules with protected memory and internal functions, plus public functions to allow usage of the object. The idea was to individually test and validate an object, much like an electronic circuit board. Tested objects are then used to build higher-level functions.

C++ and Ada are two OOP languages.
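As a minimal sketch of that encapsulation idea (in Python rather than C++ or Ada, with invented names): internal state is hidden behind a small public interface, so the object can be validated in isolation before being composed into larger systems.

```python
# Toy illustration of an object as a testable module: state is internal,
# and only the public functions form the interface.
class Counter:
    def __init__(self):
        self._count = 0  # internal state, not part of the public interface

    def increment(self):
        # Public function: the only way to change the state.
        self._count += 1

    def value(self):
        # Public function: the only way to read the state.
        return self._count

# The object can be tested on its own, like a circuit board on a bench.
c = Counter()
c.increment()
c.increment()
print(c.value())  # 2
```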

Validating software like Windows is difficult. It is not possible to algorithmically test software with other software; software is validated only against a set of test cases. I was involved in software functional validation of avionics for Boeing. There are systems with over a million lines of code.

The software complexity problem grew in proportion to the growth in random access memory size. Software grows to fill memory.
 
Damn. That's what I get for using my memory and being a child of more than one discipline while not verifying my facts. I was thinking of Edward Yourdon and Larry Constantine of structured design, and of statisticians Amos Tversky and Daniel Kahneman re: V&V, when I wrote Cosmides (of Tooby and Cosmides, of evolution-of-mind theory fame).

They all came down on the side of noting the impossibility of exhaustive V&V, the tendency for humans to err in brevity of method, and the power of structured design.

So, have you read your Yourdon and Constantine, Speakpigeon dear? I'm sure my previous post makes perfect sense when those two and the two methodological psychologists are referenced.

Oh, and as I wrote before I'm not a young person.
 
Are those the fully unmanned trains? I recall hearing they were pretty aggressively checked.

Yes, mass transit. I guess all the big bosses, I suspect up to the transport minister at the time, must have been very, very worried, enough to make it a full-blown case study on how to do that sort of thing. The line was put into service in 1998. It will be an oh-my-God 20 years in October, if I remember correctly. I don't recall any incident since and it still looks great! The same guys also worked on New York's Canarsie line a few years later, although that one isn't fully automated. And I will refrain from saying what I believe must have been the reason, except to say it's all about human beings.

The maximum error rate allowed was 10^-9 per safety-related equipment unit per functioning hour. They had to come up with a special design for the chips in there:
Coded monoprocessor
http://www.railwaysignalling.eu/safety-critical-software-in-railway-signalling

The basic principle of the Vital Coded Processor consists of a probabilistic approach using redundancy in data, space and time to detect operation errors, operand errors, dating and sequencing errors (e.g. usage of outdated operands). The data processed by the VPC is coded such that every variable (X) of the program is composed of the pair (Xf, Xc), where Xf is the value part and Xc is the control part of the variable. The program instructions are designed to deal with variables in this encoded form for both the input and the output data.
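A toy Python sketch of the coded-variable idea in that quote: every value part Xf carries a control part Xc, operations maintain both parts, and a check detects corruption. I'm using a simple multiplicative (AN-style) code purely for illustration, with an invented constant; the actual Vital Coded Processor design is far more elaborate and also covers dating and sequencing errors, which this toy does not.

```python
A = 9973  # illustrative code constant; the invariant maintained is Xc == Xf * A

def encode(x):
    # Build the pair (Xf, Xc): value part and control part.
    return (x, x * A)

def coded_add(u, v):
    # Add value parts and control parts independently; addition
    # preserves the invariant Xc == Xf * A.
    return (u[0] + v[0], u[1] + v[1])

def check(var):
    # Verify the control part before trusting the value part.
    xf, xc = var
    if xc != xf * A:
        raise RuntimeError("operation error detected")
    return xf

x = encode(20)
y = encode(22)
print(check(coded_add(x, y)))  # 42

# A corruption of the value part alone is caught at check time:
s = coded_add(x, y)
bad = (s[0] + 1, s[1])
try:
    check(bad)
except RuntimeError as e:
    print(e)  # operation error detected
```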

Rather high-tech at the time! :p
EB
 
Damn. That's what I get for using my memory and being a child of more than one discipline while not verifying my facts. I was thinking of Edward Yourdon and Larry Constantine of structured design, and of statisticians Amos Tversky and Daniel Kahneman re: V&V, when I wrote Cosmides (of Tooby and Cosmides, of evolution-of-mind theory fame).

They all came down on the side of noting the impossibility of exhaustive V&V, the tendency for humans to err in brevity of method, and the power of structured design.

So, have you read your Yourdon and Constantine, Speakpigeon dear? I'm sure my previous post makes perfect sense when those two and the two methodological psychologists are referenced.

Oh, and as I wrote before I'm not a young person.

I'm sure I could piece it together given enough time and patience but my time comes in short supply and maybe I'm not terribly patient. It's up to you to see how much you want to be understood and by whom.

Me, I would say... you don't seem to care that much. Still, you're in good company with UM and a few others.

And, sure, we all have our problems.

I still love you. :love:
EB
 
An autonomous vehicle recently killed a pedestrian.

No amount of testing ensures there are no problems. I know that from working on commercial aircraft electronics. Bugs do surface in the long term, but rigorous testing usually keeps them from being serious.
 
An autonomous vehicle recently killed a pedestrian.

No amount of testing ensures there are no problems. I know that from working on commercial aircraft electronics. Bugs do surface in the long term, but rigorous testing usually keeps them from being serious.

Well, there's nonetheless a big difference between an enclosed system, as a mass transit system typically is, and open systems like cars and aircraft. Mass transit applications are much simpler just because of that. I don't think you could do aircraft safety-related applications with the kind of technology used for line 14 of the mass transit here in Paris. Typically, you'll make the distinction between safety and the rest. The rest can be allowed to fail once in a while; the safety-related can't. In mass transit, the safety-related software can be made so simple it's doable. For open systems, I don't see how you could possibly cover all the angles. So failures, and crashes, will occur.

In the case of self-driving cars, the number of casualties per crash wouldn't be expected to get anywhere near what could happen in a mass transit crash, but the difference will be no more than a factor of one hundred at best, or even as low as ten, which maybe doesn't reduce the overall risk of self-driving cars enough given their likely higher probability of a crash. Crashes will keep happening, but they already happen with human beings doing the driving. Ultimately, it's going to be a political decision. The human factor.
EB
 