
Quantum Computers

Raul

Researcher
If anyone is interested in computers, I can tell you that microprocessors have nearly reached the limits of their architecture at 22 nanometers. The more integrated circuits shrink, the harder it becomes to actually fabricate the transistors. These are not the classical transistors used in operational amplifiers, but devices controlled through a gate. In processors these are MOSFETs (metal-oxide-semiconductor field-effect transistors), not the IGBTs (insulated-gate bipolar transistors) found in power electronics. In technical documents they are called switches, because when you apply a bit, 0 or 1 (depending on how the designer defines it, for example 1 = true and 0 = false, or vice versa), the switch either lets current flow or blocks it. As simple as that.
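To make the "switch" idea concrete, here is a tiny Python sketch (purely illustrative, not a circuit simulation; the function name and the current values are my own for the example):

```python
def switch(gate_bit: int, current_in: float) -> float:
    """Ideal transistor-as-switch: gate bit 1 lets current flow, 0 blocks it."""
    return current_in if gate_bit == 1 else 0.0

# With the gate driven high, the input current passes through.
print(switch(1, 5.0))  # 5.0
# With the gate driven low, the switch is open and no current flows.
print(switch(0, 5.0))  # 0.0
```

This is the abstraction digital logic builds on: whole circuits are reasoned about in terms of these ideal on/off switches rather than the underlying analog device physics.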


 
Interesting. He says at the end that this won't mean the end of binary computers for internet browsing, word processing, or even watching HD video. When I first heard of this I thought "exponentially faster" meant faster for simple tasks as well, but it turns out it's only advantageous for executing algorithms that would require a huge number of steps on binary systems.
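To put rough numbers on that "huge number of steps" point, here is an illustrative Python comparison. I'm using unstructured search as the example (this is my choice, not necessarily what the video covered): classical brute force over an n-bit space needs about 2^n evaluations, while Grover's quantum search needs about 2^(n/2), a quadratic speedup that only pays off when n is large.

```python
import math

def classical_steps(n_bits: int) -> int:
    # Exhaustive search must try every one of the 2**n candidates in the worst case.
    return 2 ** n_bits

def grover_steps(n_bits: int) -> int:
    # Grover's algorithm needs on the order of sqrt(2**n) oracle queries.
    return math.isqrt(2 ** n_bits)

for n in (8, 40):
    print(n, classical_steps(n), grover_steps(n))
```

At n = 8 the gap is trivial (256 vs 16 steps), which is why simple everyday tasks see no benefit; at n = 40 it is roughly a trillion vs a million, which is where quantum hardware starts to matter.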
 

Long live binary and the good old von Neumann architecture of computer systems!!!! :D
 