Analogue...Digital...what next?

Comments

  • gemma-the-husky
    All this quantum stuff sounds like so much snake oil to me. There is only analogue and digital.

    i.e.
    analogue - the real world
    digital - a means of processing analogue
  • zx50
    Si_Crewe wrote: »
    Admittedly, my example of a home PC was a bit of an exaggeration.

    The idea of a biological chip is very interesting though.
    If a computer system has the ability to expand its capabilities when it needs to, then you've got what's probably one of the fundamental requirements for AI.
    You can't really do that with manufactured chips because they'll always have pre-determined limitations.
    If you can create a "chip" which can be grown organically then it should be able to expand as required.

    Course, that'd probably put a big dent in the market for computer upgrades. ;-)

    Which is why it will most likely not happen.
  • zx50
    gomezz wrote: »
    But it could eliminate the 0.3 second delay from the eye seeing to the brain "seeing".

    That's a long time to wait. :D
  • zx50
    gemma-the-husky wrote: »
    All this quantum stuff sounds like so much snake oil to me. There is only analogue and digital.

    i.e.
    analogue - the real world
    digital - a means of processing analogue

    If I understand it correctly, a quantum processor would be able to process more information at a time than a processor that processes digitally. A digital processor processes one bit at a time, whereas a quantum processor would process multiple bits at a time. Digital information is completely different from analogue information. Digital information is created these days without even converting from analogue. Digital information is [0101101000101] whereas analogue is different in some way.
  • bobcar
    zx50 wrote: »
    Digital information is created these days without even converting from analogue.

    Digital always has to be converted from analogue if it's representing real-world information. Whether, say, typing on a keyboard or doing CGI is converting from analogue is one for the philosophers, but "these days" are no different from "those days" in this respect, other than there being a lot more digital about.
    Digital information is [0101101000101] whereas analogue is different in some way.

    Analogue is the real world, so if you, say, measure a distance you might get 17m; measure a bit better and you get 17.3m, better still and you get 17.32m, and so on. What you get is a continuous quantity (okay, at some point you get into quantum effects).

    Digital just means you've converted the real world into numbers, which can then be processed by a digital system such as a digital computer (rather than an analogue computer), a Blu-ray player etc. Digital does not have to be binary, though it usually is - a typical exception would be a modern modem signal, which carries digital symbols in bases other than two.

    Apart from ease of processing, digital has the advantage that it can be copied or transferred with no errors, whereas analogue copying always introduces errors. Digital is also more efficiently compressed, which is vital for modern TV etc.
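    The measurement point above can be sketched in a few lines of Python: a toy analogue-to-digital converter (made-up range and bit depth, not any real device) that maps a continuous value onto one of a fixed set of levels, which is all "going digital" really means.

```python
def quantise(value, v_min=0.0, v_max=1.0, bits=4):
    """Map a continuous value onto one of 2**bits discrete levels."""
    levels = 2 ** bits
    # Clamp to the converter's input range.
    value = min(max(value, v_min), v_max - 1e-12)
    step = (v_max - v_min) / levels
    code = int((value - v_min) / step)  # the digital code, 0..levels-1
    return code

# A 4-bit converter has only 16 levels, so nearby analogue values
# can land on the same digital code (quantisation error).
print(quantise(0.500))  # -> 8
print(quantise(0.510))  # -> 8 (information lost)
print(quantise(0.700))  # -> 11
```

    More bits means finer steps, which is the trade-off real converters make between fidelity and data rate.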
  • zx50
    bobcar wrote: »
    Digital always has to be converted from analogue if it's representing real-world information. Whether, say, typing on a keyboard or doing CGI is converting from analogue is one for the philosophers, but "these days" are no different from "those days" in this respect, other than there being a lot more digital about.

    Analogue is the real world, so if you, say, measure a distance you might get 17m; measure a bit better and you get 17.3m, better still and you get 17.32m, and so on. What you get is a continuous quantity (okay, at some point you get into quantum effects).

    Digital just means you've converted the real world into numbers, which can then be processed by a digital system such as a digital computer (rather than an analogue computer), a Blu-ray player etc. Digital does not have to be binary, though it usually is - a typical exception would be a modern modem signal, which carries digital symbols in bases other than two.

    Apart from ease of processing, digital has the advantage that it can be copied or transferred with no errors, whereas analogue copying always introduces errors. Digital is also more efficiently compressed, which is vital for modern TV etc.

    What, so you're saying that even in 2014, film cameras used to record programmes etc are analogue? I find that difficult to believe.
  • and101
    zx50 wrote: »
    What, so you're saying that even in 2014, film cameras used to record programmes etc are analogue? I find that difficult to believe.

    The sensor in the camera is analogue. An analogue to digital converter is used to convert the analogue light level into a digital number which is then sent to the processor.

    Digital cameras typically only record 8 bits or 256 light levels per colour channel which is why digital photos never have as much colour depth or contrast as real life. You can get cameras that record at higher bit depths and resolution but they are expensive and need fast computers to work with the images as a single photo can be several gigabytes in size.
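    The bit-depth arithmetic in the post above is easy to check: 8 bits per channel gives 2^8 levels, and three channels combine multiplicatively.

```python
bits_per_channel = 8
levels = 2 ** bits_per_channel  # 256 light levels per channel
colours = levels ** 3           # R, G and B combined
print(levels)   # 256
print(colours)  # 16777216 (~16.8 million colours)

# A camera recording 10 bits per channel captures four times
# as many levels per channel:
print(2 ** 10)  # 1024
```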
  • chrisjr
    zx50 wrote: »
    What, so you're saying that even in 2014, film cameras used to record programmes etc are analogue? I find that difficult to believe.

    The interface between the real world and the "digital" world has to be analogue because the real world is analogue.

    So the image sensor in a camera produces an analogue electrical signal from the light hitting it. It is very soon converted into digital however.

    You can also get "digital" microphones to record the sound alongside your digital camera. But the bit of the mic that the sound waves hit first is analogue.

    So it might not be a very large part of the signal chain but it all starts out as analogue. And of course all ends up as analogue in your living room. :)
  • zx50
    and101 wrote: »
    The sensor in the camera is analogue. An analogue to digital converter is used to convert the analogue light level into a digital number which is then sent to the processor.

    Digital cameras typically only record 8 bits or 256 light levels per colour channel which is why digital photos never have as much colour depth or contrast as real life. You can get cameras that record at higher bit depths and resolution but they are expensive and need fast computers to work with the images as a single photo can be several gigabytes in size.

    Unless it's not possible to create a fully digital recording device then.
  • and101
    zx50 wrote: »
    Unless it's not possible to create a fully digital recording device then.

    No, the world is analogue so if you want to make a recording you will have to convert the analogue world into a digital stream of information.
  • Anachrony
    zx50 wrote: »
    A digital processor processes one bit at a time whereas a quantum processor would process multiple bits at a time.

    Quantum computers operate on quantum bits rather than bits. A qubit can be both 0 and 1 at the same time, a fundamental difference. The transformative power of quantum computing is that when you combine enough quantum bits you can solve calculations for an extremely large number of values simultaneously. Exponentially more of them, rather than linearly more as you get with a more powerful digital computer.

    An example of an application that everyone wants this capability for is to break current encryption schemes. With certain current encryption schemes, using a very fast digital processor, it could take millions of years to try every combination and crack someone's strong encryption. But with a decent quantum computer you could try out all the possibilities simultaneously and arrive at the answer almost immediately. It would require coming up with entirely new encryption technologies to stay ahead of that curve.

    Medical applications are another example, not as straightforward, but potentially much more important. Trying to simulate interactions of very complex molecules is very computationally intensive. A quantum computer's ability to work through all the combinations could revolutionize computational biology and such, which could mean major steps forward in a number of important areas that would impact our everyday lives.

    That's the theory, anyway. How practical it will be to implement this kind of computer in real life is still up for debate. The current prototypes aren't yet very compelling.
  • zx50
    Anachrony wrote: »
    Quantum computers operate on quantum bits rather than bits. A qubit can be both 0 and 1 at the same time, a fundamental difference. The transformative power of quantum computing is that when you combine enough quantum bits you can solve calculations for an extremely large number of values simultaneously. An example of an application that everyone wants this capability for is to break current encryption schemes. With certain current encryption schemes, using a very fast digital processor, it could take millions of years to try every combination and crack someone's strong encryption. But with a decent quantum computer you could try out all the possibilities simultaneously and arrive at the answer almost immediately.

    That's the theory, anyway. How practical it will be to implement in real life is still up for debate.

    I'm guessing that quantum processors process varying amounts of data instead of one at a time. If this is right, then no wonder quantum processors are being described as phenomenally fast.

    Edit: I should have watched a YouTube video on how quantum processors work before making a post about them.
  • and101
    The next big shift with computing will probably be with memristor technology. A memristor has the speed of RAM while holding its state when power is removed, like solid state storage, so you basically have the best of both worlds.

    What this will mean is you can build a computer with one type of memory that can be used for both processing and storage. When you switch on your computer it would remember what you were doing when you turned it off, so everything would continue as if the power had not been removed. It would also mean that the computer is not constantly fetching information from storage and moving it into RAM, so processing would be far faster with lower power consumption.

    The end result is computers that are smaller, faster and use less energy than today's machines.

    Here is an article on memristors.
  • Anachrony
    zx50 wrote: »
    I'm guessing that quantum processors process varying amounts of data instead of one at a time. If this is right, then no wonder quantum processors are being described as phenomenally fast.

    For classes of problems that are suited to this sort of massive parallelism, yes. There are still certain algorithms where you just need really fast sequential logic, in which case digital will probably still come out the winner. But there are many things that would benefit greatly from this parallelism.
  • bobcar
    zx50 wrote: »
    A digital processor processes one bit at a time

    I missed this when I replied to your thread earlier. A digital processor can and usually does process multiple bits at a time and in modern devices will often do multiple/parallel operations on multiple bits at the same time.

    Don't get hung up on the binary nature of most digital electronics; that doesn't mean dealing with one bit then another, it just means that the state at the lowest level will be either 0 or 1. For example, the analogue input may be converted to 16 bits, and if you add another 16-bit value to this then the addition takes place on all 16 bits at the same time (usually - there were serial computers, Ferranti for one made them, but this is not usual nowadays).
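    The serial-versus-parallel distinction above can be illustrated with a toy Python sketch (not how any real chip is coded, just the logic): a bit-serial ripple add works through one bit per step, while a modern ALU applies the same full-adder logic to all 16 bits in a single operation.

```python
def serial_add(a, b, width=16):
    """Add two integers one bit at a time, the way an old
    bit-serial machine would: one full-adder step per bit."""
    result, carry = 0, 0
    for i in range(width):
        bit_a = (a >> i) & 1
        bit_b = (b >> i) & 1
        s = bit_a ^ bit_b ^ carry  # sum bit for this position
        carry = (bit_a & bit_b) | (carry & (bit_a ^ bit_b))
        result |= s << i
    return result & ((1 << width) - 1)

# A parallel 16-bit ALU produces the same answer in one operation:
a, b = 40000, 1234
print(serial_add(a, b))          # 41234
print((a + b) & 0xFFFF)          # 41234 - same result, one step
```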
  • zx50
    bobcar wrote: »
    I missed this when I replied to your thread earlier. A digital processor can and usually does process multiple bits at a time and in modern devices will often do multiple/parallel operations on multiple bits at the same time.

    Don't get hung up on the binary nature of most digital electronics; that doesn't mean dealing with one bit then another, it just means that the state at the lowest level will be either 0 or 1. For example, the analogue input may be converted to 16 bits, and if you add another 16-bit value to this then the addition takes place on all 16 bits at the same time (usually - there were serial computers, Ferranti for one made them, but this is not usual nowadays).

    Well, after watching a YouTube video just now, a scientist, or whatever, claims that the quantum processors will only be useful when it comes to complex calculations. It won't improve on program loading times and other similar things. Looks like the home user might not need quantum processors then.
  • bobcar
    zx50 wrote: »
    Well, after watching a YouTube video just now, a scientist, or whatever, claims that the quantum processors will only be useful when it comes to complex calculations. It won't improve on program loading times and other similar things. Looks like the home user might not need quantum processors then.

    To be honest I don't know enough about them to know what effect they'll have on home computers. The worrying thing is the decryption possibilities; if they do indeed turn out to be able to quickly decrypt messages then a lot of the internet as we know it will cease to function without an entirely new concept in encryption.
  • zx50
    bobcar wrote: »
    To be honest I don't know enough about them to know what effect they'll have on home computers. The worrying thing is the decryption possibilities; if they do indeed turn out to be able to quickly decrypt messages then a lot of the internet as we know it will cease to function without an entirely new concept in encryption.

    Looks like some new form of security will be needed before these are sold to the consumer, or put in home computers, tablets etc. If no new form of security can be created before they're ready to hit the market, then they might only be suitable for places that need complex calculations performed.
  • spiney2
    psionic. everyday life will become like p k dick novel .......

    cryonic. everything will be at almost absolute zero temperature so that quantum squid devices will work (for brain control apps etc ).........

    moronic ...... but i think that is already happening .......
  • spiney2
    bobcar wrote: »
    I missed this when I replied to your thread earlier. A digital processor can and usually does process multiple bits at a time and in modern devices will often do multiple/parallel operations on multiple bits at the same time.

    Don't get hung up on the binary nature of most digital electronics, that doesn't mean dealing with one bit then another etc it just means that the state at the lowest level will be either 0 or 1, For example the analogue input may be converted to 16 bits and if you add another 16 bit value to this then the addition takes place on all 16 bits at the same time (usually, there were serial computers - Ferranti for one made them - but this is not usual nowadays).

    2 state logic usually depends on analogue devices inside the chips ........ the only one-bit-at-a-time device is the original Turing machine, which is conceptual not physical .......
  • spiney2
    bobcar wrote: »
    To be honest I don't know enough about them to know what effect they'll have on home computers. The worrying thing is the decryption possibilities; if they do indeed turn out to be able to quickly decrypt messages then a lot of the internet as we know it will cease to function without an entirely new concept in encryption.

    currently - ignoring bugs like Heartbleed and spyware etc - a brute force attack is needed to crack the usual public/private key encryption methods. all quantum computing would do is speed this up. it does not change the basic maths involved.
  • zx50
    spiney2 wrote: »
    currently - ignoring bugs like Heartbleed and spyware etc - a brute force attack is needed to crack the usual public/private key encryption methods. all quantum computing would do is speed this up. it does not change the basic maths involved.

    The signal of digital processors can either be up or down (on/off). The signal for quantum processors can be both up and down, and also left and right, at the same time, I think. As the qubit number increases, the number of values that can be held doubles every time. If this processor turns out to be suitable for home computers, people will likely be utterly gobsmacked by the phenomenal power of these absolute beasts.
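    The "doubles every time" intuition is the standard counting argument: an n-qubit register spans 2^n basis states. A minimal Python sketch of just the arithmetic (nothing quantum is simulated here):

```python
def basis_states(n_qubits):
    """Number of basis states an n-qubit register spans: 2**n."""
    return 2 ** n_qubits

for n in (1, 2, 8, 16, 32):
    print(n, "qubits ->", basis_states(n), "basis states")

# Each added qubit doubles the count:
assert basis_states(9) == 2 * basis_states(8)

# Classical bits also have 2**n possible *values*, but a classical
# register holds only one of them at a time; a quantum register can
# be in a superposition over all of them.
```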
  • bobcar
    spiney2 wrote: »
    currently - ignoring bugs like Heartbleed and spyware etc - a brute force attack is needed to crack the usual public/private key encryption methods. all quantum computing would do is speed this up. it does not change the basic maths involved.

    Well, according to what I've read (and I certainly don't pretend to know much about this, so anyone please come and correct me), what quantum computing will do is allow the result really quickly because the processing will "take the course" needed to produce the correct result and thus decrypt really quickly - quantum computing will calculate all possible results simultaneously but only the correct one will be read out. This does correlate with quantum particles existing everywhere until detected (again, no expert).

    But whatever the mechanism, if decryption is very quick then much of the internet will not function safely. The actual maths involved in encryption is not that difficult - well, at least the algorithms aren't; reading the original papers made my head hurt, but all I need are the algorithms. I myself have written AES encryption/decryption, HMAC etc. software as is used on the internet; none of this is secret. The reason the encryption is safe is because so many computations are needed to decrypt without the key.
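    The point that the algorithms are public can be seen directly: Python's standard library ships HMAC. A minimal sketch of HMAC-SHA-256 (the key and message here are made up for illustration) - the security rests entirely on the secrecy of the key, not of the method.

```python
import hashlib
import hmac

# HMAC-SHA-256 over a message with a shared secret key.
key = b"a made-up shared secret"
message = b"the algorithms themselves are not secret"

tag = hmac.new(key, message, hashlib.sha256).hexdigest()
print(tag)

# The receiver recomputes the tag from the same key and message,
# then compares in constant time to avoid timing leaks:
expected = hmac.new(key, message, hashlib.sha256).hexdigest()
print(hmac.compare_digest(tag, expected))  # True
```

    Without the key, an attacker is left trying the whole key space, which is exactly the "so many computations" the post describes.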
  • bobcar
    spiney2 wrote: »
    2 state logic usually depends on analogue devices inside the chips

    Well yes, of course, the real world is analogue. However, the actual logic in most VLSI is very much one FET switched on opposing another switched off, or vice versa, with a quick transition between the two states; this is vital to keep power consumption down, and everything is designed for the circuit to be in one state or the other. The old bipolar (TTL) logic was much more complicated in terms of gate circuitry, but even that involved rapid transitions between two states. Analogue ICs are very different, and you can tell at a glance whether an IC schematic is digital or analogue.
  • zx50
    After watching a YouTube video, it looks like a quantum computer's been sold to Google and NASA. I doubt the consumer will be able to buy one in under 8 years, maybe more.

    Edit: Look how long it took for early computers to shrink down to the size of the desktop computers we have today. The quantum computer that D-Wave built isn't THAT big, but a lot of work will need to be done before it's down to the size of a desktop machine.