The Nobel Prize in Literature 2017 was awarded to Kazuo Ishiguro "who, in novels of great emotional force, has uncovered the abyss beneath our illusory sense of connection with the world".
I have now read "half of his novels published in English", i.e. four of his novels. I am not sure whether anyone has ever been awarded the Nobel Prize in Literature on the basis of a smaller body of work, but having read some of his novels I have never seen text more beautifully presented than what I have experienced in them. I would like to draw attention to three of his novels set in three very different historical periods of England: one contemporary (Never Let Me Go. – London : Faber & Faber, 2005), one narrated by a butler reminiscing about the "glory" of upper-class England between the First and Second World Wars (The Remains of the Day. – London : Faber & Faber, 1989), and a third set in rural England in the Middle Ages, amid the ever-ongoing conflicts between the Saxons and the Britons (The Buried Giant. – London : Faber & Faber, 2015).
The key is that Kazuo Ishiguro actually tells each story in the distinct language and dialects that were used during these three historical periods, and he does so with an ease and elegance that I for one have never experienced before. It is as if he has spent hours mending and fitting each English word into its sentence, leaving nothing to chance; his language is simply amazing. Kazuo Ishiguro is truly one of the most deserving Nobel Prize in Literature laureates ever, despite his somewhat minuscule production...
The intention of this blog is to share views on topics like innovation, science, enterprise architecture and even nature photography, literature, travel, motorbiking, scuba diving etc.
Wednesday, December 6, 2017
Quantum Computing - from a novice point-of-view
Quantum Computing promises to change the way we do computing and to let us solve problems that we are unable to solve with traditional computers based on silicon transistors. We have all observed the physical limitations of the incumbent transistor gate, which is currently down to about 10 nm of contact length, and provided researchers can successfully use nanotubes as a new material, we may see 5 nm transistors in the near future.
Conceptually, this technology and the way a Quantum Computer is built could not be more different, and therein lie the challenges and application constraints of the two technologies. Our conventional silicon-based computers have already stopped scaling vertically, due to heat issues when trying to reduce the contact length of the transistor gates, so we now resolve this by building multi-core systems: a horizontal scaling scheme that requires new programming models in each operating system, as well as in individual applications, so that different threads can be processed in parallel on multiple cores.
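To make the multi-core idea concrete, here is a minimal sketch in Python (my own illustration, not part of the original post): one CPU-bound job is split into independent chunks that run in parallel on several cores. The prime-counting workload and the chunk sizes are assumptions chosen only to show work being spread across cores.

```python
# A toy illustration of horizontal scaling: split one CPU-bound job into
# independent chunks and run them in parallel on several cores.
# The workload (counting primes) and the chunk sizes are illustrative only.
from concurrent.futures import ProcessPoolExecutor

def count_primes(bounds):
    """Count the primes in the half-open range [lo, hi)."""
    lo, hi = bounds
    count = 0
    for n in range(max(lo, 2), hi):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    # Four independent chunks, one per worker process (i.e. per core).
    chunks = [(i * 50_000, (i + 1) * 50_000) for i in range(4)]
    with ProcessPoolExecutor(max_workers=4) as pool:
        total = sum(pool.map(count_primes, chunks))
    print("primes below 200000:", total)
```

The point of the sketch is that the parallelism has to be arranged explicitly by the programmer, which is exactly the burden the new programming models place on operating systems and applications.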
As it happens, parallelism is precisely the strength of the Quantum Computer.
Conventional digital computers store information as binaries, i.e. as bits that take a value of either 0 or 1. Quantum Computers use the superposition of quantum states of particles, i.e. they use atoms, electrons, ions or photons to store information in elements called qubits.
As qubits can exist in superposition, they can represent 0, 1, or a superposition of 0 and 1 at the same time.
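As a rough sketch of what that means (my own illustration, not the post's notation): a single qubit can be written as two amplitudes, one for 0 and one for 1, and the squared magnitude of each amplitude gives the probability of that measurement outcome.

```python
# A minimal sketch of one qubit as a pair of amplitudes for |0> and |1>.
# The squared magnitude of each amplitude is the probability of measuring
# that value (the Born rule). The example states below are illustrative only.
import math

ket0 = [1.0, 0.0]                               # definitely 0
ket1 = [0.0, 1.0]                               # definitely 1
plus = [1 / math.sqrt(2), 1 / math.sqrt(2)]     # equal superposition of 0 and 1

def probabilities(state):
    """Return the probability of measuring 0 and of measuring 1."""
    return [abs(amplitude) ** 2 for amplitude in state]

print(probabilities(ket0))   # [1.0, 0.0] -> always reads out 0
print(probabilities(plus))   # ~[0.5, 0.5] -> reads out 0 or 1 with equal chance
```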
Unlike components in an electrical circuit, qubits are tiny particles (atoms have a size of 0.1-0.5 nm) that are magnetically suspended (or suspended by laser beams) in an extremely cold environment, fractions of a degree above absolute zero. What is so clever about this is that by keeping these particles in a state of superposition, they can simultaneously take on the role of both the 0 and the 1 in binary code.
Because qubits can hold these multiple quantum states simultaneously, Quantum Computers have the potential to be millions of times more powerful than today's conventional supercomputers for certain classes of problems.
The information in a qubit is encoded in its quantum properties, such as the spin of an electron. The number of possible states equals 2^n for n qubits, i.e. two qubits can process four states simultaneously (in parallel) and 6 qubits can process 2^6 = 64 states simultaneously. Each qubit can give only one answer when measured (under the mind-bending rules that govern the quantum world, measuring or even observing a subatomic particle alters it), but the superposition of states provides extraordinary processing power, if we can work out how to build these qubits into a computer.
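A tiny statevector sketch (again my own illustration, with made-up helper names) shows both sides of this: 6 qubits in an equal superposition are described by 2^6 = 64 amplitudes at once, yet a single measurement returns only one of those 64 outcomes.

```python
# Toy statevector illustration: n qubits -> 2**n amplitudes tracked at once,
# but one measurement yields a single outcome with probability |amplitude|^2.
import random

def uniform_superposition(n_qubits):
    """Equal superposition over all 2**n basis states of n qubits."""
    dim = 2 ** n_qubits
    amplitude = (1.0 / dim) ** 0.5
    return [amplitude] * dim

def measure(state):
    """Sample one basis state according to the Born rule."""
    weights = [abs(a) ** 2 for a in state]
    return random.choices(range(len(state)), weights=weights)[0]

state = uniform_superposition(6)
print(len(state), "amplitudes held in superposition")            # 64
print("a single measurement returns just one outcome:", measure(state))
```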
Currently the future quantum computer is being targeted at specialised parallel tasks, including complex computations of the possible folding configurations of complex molecules. Protein folding (the process by which proteins encoded by DNA/RNA take on their three-dimensional shapes) is an enormously complex computational problem that is near impossible to solve with conventional computers.
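To give a feel for why such folding problems overwhelm conventional machines, here is a deliberately simplified toy of my own (a stand-in, not real biophysics): counting self-avoiding paths on a 2D grid, a crude proxy for the possible backbone shapes of a short chain, whose count explodes exponentially with chain length.

```python
# Toy stand-in for conformation counting (not real biophysics): count
# self-avoiding walks on a 2D grid as a crude proxy for the number of possible
# backbone shapes of a short chain. The count grows exponentially with length,
# which is what makes realistic folding problems intractable classically.

MOVES = [(1, 0), (-1, 0), (0, 1), (0, -1)]

def count_conformations(steps, path=((0, 0),)):
    """Count self-avoiding walks with the given number of remaining steps."""
    if steps == 0:
        return 1
    x, y = path[-1]
    total = 0
    for dx, dy in MOVES:
        nxt = (x + dx, y + dy)
        if nxt not in path:          # the chain is not allowed to cross itself
            total += count_conformations(steps - 1, path + (nxt,))
    return total

for length in (5, 10, 12):
    print(length, "steps ->", count_conformations(length), "conformations")
```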
Google has built a Quantum Computer with 9 qubits, and IBM Research claims it will be able to build a ~50-qubit Quantum Computer within a few years. At the same time, IBM Research has also managed to build a 5 nm transistor based on silicon nanosheets that promises a 40% performance improvement over today's 10-14 nm chips. To me, this pushes the Quantum Computer further into a distant future, as we tend to lean towards conventional digital computers, which are general-purpose machines.