
Hardware and Software

Telepath John

Active Member
Hello,

I have worked in the computer industry since 1961. At that time most business information processing was done on unit record equipment.

I like using assembler programming languages. I have used FAP (FORTRAN Assembly Program), BAL (Basic Assembler Language), and, more recently, Flat Assembler. These are not popular languages, largely because of the user interface they present to the programmer. My idea is that a browser-based interface would make this deep level of coding more practical.

I was programming in C a few years ago when I realized that the GNU C compiler had been changed to prevent the dreaded divide-by-zero error. The C language will not allow a value to be reduced to zero, to prevent that error. However, if you want to read values from a file, the offset of the first record is zero. So you need to write extra code just to read file records. I stopped using C at that point.

The computers we use today are not stable. The fatal divide-by-zero error is just one example. I am not well informed about the chips we use in smartphones; maybe they are designed better. Most of my experience has been in developing business software applications, and the computers we use for that work are the ones we need to engineer better.

My understanding is that run-time software libraries would make programming more efficient. There would be no need to import libraries, or to duplicate code across various libraries, for very common routines like converting an ASCII character to a binary value. I once worked with a software language called DBL which allowed a run-time library. No one seems to know how to build that kind of library for today's popular languages.

The other technique I liked to use was memory-resident file systems. In Linux you can make some files memory resident. Record locking for updates would be done at the level of memory pages, and all fixed records would be sized so that a whole number of them fits into a page of memory.

I realize that current-day database systems are far removed from these techniques. Yet we cling to them as if what we are doing is actually good technology. In my view, it is not.

I suspect we are 20-30 years away from making stable hardware and reasonably good software products. I would like to influence that speculation so that the time frame can be reduced to 10-20 years. The companies which dominate this industry are not interested in change. Yet change to stability and productivity cannot be prevented indefinitely.

I post this thread in this forum because I believe autistic people can think in more elevated ways. My hope is that this belief can be validated.

John
 
I would like to influence that speculation so that the time frame can be reduced to 10-20 years.
How would you influence things to reduce the timeframe?
The companies which dominate this industry are not interested in change. Yet change to stability and productivity cannot be prevented indefinitely.
This is true. They make interesting products, but offer too many upgrades too soon, generating a 'must have the latest thing' urge in people when what they already have is fine. They put out new tech and software that is buggy, then use early adopters as unpaid beta testers to find the problems so they can finish it off. :rolleyes:

The problem is there's so much money to be made, and the need to keep growing and innovating. It's what they do. Hopefully in 10 years it will be better. I suppose in many ways it's not that much better than 10 years ago. Just smaller, faster, but essentially the same. Perhaps quantum computing will be the big leap forward.
 
How would you influence things to reduce the timeframe?

This is true. They make interesting products, but offer too many upgrades too soon, generating a 'must have the latest thing' urge in people when what they already have is fine. They put out new tech and software that is buggy, then use early adopters as unpaid beta testers to find the problems so they can finish it off. :rolleyes:

The problem is there's so much money to be made, and the need to keep growing and innovating. It's what they do. Hopefully in 10 years it will be better. I suppose in many ways it's not that much better than 10 years ago. Just smaller, faster, but essentially the same. Perhaps quantum computing will be the big leap forward.
I find it is best to influence change with kindness and understanding for how things are. The true innovators are the people who design chips and write code. They are the ones who can evolve what we do in these areas of activity.
All companies need to make money to create a better, more stable economy. It is the greed and the short-term thinking which will eventually change. We are all taught the economic principle of scarcity of resources. The principle of abundance and prosperity for all has yet to be recognized and taught. The simple reality is that when all are prosperous, they will be able to buy more things. There will be global economic stability. That is a good thing for big business that they truly do not yet understand.
My understanding is that the first step for human evolution is this global economic stability. At the moment we do not see that principle as essential. Some do, but most do not.
Speaking about it in a good way should diminish the ignorance and encourage deeper reflection.
John
 
Unfortunately, as long as companies like Microsoft and Apple are allowed to run rampant, things will never reach the point of "stable". The guys at the top of each company not only don't understand these tech concepts (they're business executives, not programmers or engineers, after all), they also don't CARE. All that matters to them is that they stay ultra-rich and keep getting even richer... nothing more. That's how big corporations are.

Personally I'd be happy to just get away from freaking Windows. I'm tired of putting out the endless braindead fires that it loves to create.

*ahem* Aaaaaaaaanyway, the other issue of course is getting more people to be, well, into this stuff. It's hard for people to get into things like programming and whatnot... who knows how many potentially brilliant individuals are out there who COULD make something truly amazing, but simply haven't because of the nasty barrier to entry?

I mean, that stuff is intimidating. I've always had sort of an interest myself... used to mess around with good ol' QBASIC as a kid, and I've also had some game dev experience (that was mostly scripting though), but the idea of REALLY learning programming? Rough. Wouldn't even know where to begin, wouldn't know where to GO with it, and even if I did, I've zero ability when it comes to math. So, I've never tried. I'm betting there's a heck of a lot of others who are in THAT boat right along with me.
 
@Misery I do love how your personality comes so clearly through in your writing. :)

I've had an interest in computing ever since I was young, when the first computers I programmed in BASIC were the VIC-20 and the Sinclair ZX Spectrum. I have made a few websites before, but not by coding them directly, more by using graphic design software. It's still pretty technical in the sense of bringing everything together, but it doesn't involve literally writing the code, and if there is something missing that requires code, snippets can be found on the Internet anyway. I haven't done it in ages, but I've been quite happy with the results. It's always a steep learning curve for me, and if I haven't done it for a while it's like I have to start again, as I remember very little. :rolleyes:

I'm fascinated with the idea of quantum computing. How a qubit can be on and off, zero and one, at the same time. Mind boggling.
 
Hello,

I have worked in the computer industry since 1961. At that time most business information processing was done on unit record equipment.

Nice to come across someone who beat me by a year! You must have been military, though.
One big problem is that object orientation got hung up on object hybridisation and has never really got past that. As a result, the modularity you talk about is compromised if concept upgrades change the granularity or features of its objects, allowing exploitable gaps. Coders seem to be spending more and more time tweaking as a result. We'd do better if encapsulation were more rigorously enforced.
That in turn suggests compilers haven't been written with the mastery of the subject that is implied. About the first thing Bell Labs did was raid previous work for good ideas (immortalising me by accident in the process!), showing Darwinistic generation rather than saying "This is how to do it", and causing the issues you raise. The fix has been more code dialects than I care to think of, rather than complete rebuilds: we might otherwise be at C v200 by now!
 
@Misery
I'm fascinated with the idea of quantum computing. How a qubit can be on and off, zero and one, at the same time. Mind boggling.

Carlo Rovelli's Helgoland is on the subject right now. I'm waiting for one of our number to settle into an indeterminate state and start to hack the things.

For the layman, it's all to do with ideas on ignorance. We're no longer in a binary true/false dichotomy, but one where divide-by-zero indeterminacy needs to be recognised as a distinct form (possibly a tagged numerator, or a fraction couplet where the denominator is always 0), alongside "undefined", volatile/dynamic, and quite possibly APL-style vector feeds. The crux is in Rovelli's first chapter, which describes Werner Heisenberg's technique as a matrix yet remains stuck in the equation formats used for the last 50 years, whereas we need a more flexible value matrix (the RHS of the equation) allowing for inequalities and probabilities.
 
I do love how your personality comes so clearly through in your writing. :)

I've always blamed it on all the Garfield comics I grew up with.

I'm fascinated with the idea of quantum computing. How a qubit can be on and off, zero and one, at the same time. Mind boggling.

This is one of those things I just try not to think about. Frankly there's already enough quantum lunacy around without these qubit things flying all over the place or whatever it is they do. Freaking R'lyeh makes more sense. Well, okay, maybe not.

Also I rather dread the idea of some sort of, I dunno, Quantum Windows. What a horrifying concept.
 
Personally I'd be happy to just get away from freaking Windows. I'm tired of putting out the endless braindead fires that it loves to create.

I use MX Linux. It is one of the most popular Linux desktops because it is user friendly.

You would want to try something like this out before migrating all your personal stuff to another platform.

John
 
 
How would you influence things to reduce the timeframe?

This is true. They make interesting products, but offer too many upgrades too soon, generating a 'must have the latest thing' urge in people when what they already have is fine. They put out new tech and software that is buggy, then use early adopters as unpaid beta testers to find the problems so they can finish it off. :rolleyes:

The problem is there's so much money to be made, and the need to keep growing and innovating. It's what they do. Hopefully in 10 years it will be better. I suppose in many ways it's not that much better than 10 years ago. Just smaller, faster, but essentially the same. Perhaps quantum computing will be the big leap forward.
Did anybody view the Sunday, December 3, 2023 '60 Minutes' story on quantum computing?

The '60 Minutes' story discussed both the development of quantum computing technology in California and practical applications with the Cleveland Clinic.

How might quantum computing assist with more mundane things? Two examples: helping people concerned with the autism spectrum, and social and economic development.
 
