What do these two have in common? One is the reality of the future and one is the imagination of the future. Let's back that shit up, right? If you look at late-1800s visions of the future we live in today, you'll notice a unifying theme (besides ubiquitous flight... and can you blame them? Do you know what Broadway looked like in 1880? I'll give you a hint: Ocean of Horse Poop).
Let's try that again: you'll notice a unifying theme - these visions illustrate only how an entire human, or a system of humans, will interact with a technology: a man standing on the exposed platform of a proto-zeppelin, people standing on a rail platform a hundred feet above the ground, and so on.
The concept of technology was so new (and so large) that its scale was too complicated to absorb. The best they could come up with was systems of levers and wheels. The idea that technology might need to communicate with people hadn't even occurred to anyone - you won't see anything more complicated than (literally) bells and whistles until the turn of the century. Gauges were a beautiful dream.
Around 1910 we see the rise of displays in fiction - but only for remote observation. Pegboards, buttons, patch cables, switches, levers, tube systems - these are the way of the future. Communication between people is handled in a very handwavy fashion, much as it would be in the 1950s and '60s by whatever the futurists of the day decided to call the "videophone." How could they possibly foresee machines which construct their own reality inside themselves to display to you - or to augment your own?
The ubiquity of the moving picture changed all that; soon everybody thought it would replace the written word. Here was a technology which was essentially preconstructed, recorded reality-in-a-box. Just turn the crank and shine a light through it!
By the 1920s and '30s, the concept of mechanical men and automatons had entered the public consciousness. Gone were the days of thinking about how to make machines work - people just had to talk to their videophone and some mechanical man (filled to the brim with rebellious evil but shackled to the human will by the miracles of technology) would perform their every whim. It's safe to say that people at the time failed to grasp the complexity of consciousness. This misunderstanding would persist for the next six decades.
In 1945, a man named Vannevar Bush wrote of a device he called the memex, or memory extender - in concept, basically a dual-touchscreen Wikipedia. The system would be a collection of microfilm notes linked mechanically in train-of-thought associations. The problems were that it was single-player - meant only as an extension of memory for a single person - and that no technology of the time could actually build it. A further weakness was the lack of searchability, and you'll see this in early computer-based sci-fi from the 1950s: the idea that computer users have highly personalized systems which are really only usable by their primary user, because of how idiosyncratic people's personal filing systems tend to be.
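Those train-of-thought linkages can be sketched as a toy data structure - this is my own illustration of the idea, not Bush's design; the note names are hypothetical:

```python
# Toy sketch of the memex idea: notes joined by associative "trails"
# built by the reader, rather than found by search.
notes = {
    "longbow": "Notes on the construction of the English longbow.",
    "elasticity": "Notes on elastic materials under tension.",
    "yew": "Notes on the physical properties of yew wood.",
}

# A trail is an ordered chain of associations through the collection.
trail = ["longbow", "elasticity", "yew"]

def follow(trail, notes):
    """Walk a trail the way a memex user would crank through linked frames."""
    return [notes[name] for name in trail]

for text in follow(trail, notes):
    print(text)
```

The point of the structure is that retrieval depends entirely on links the owner personally laid down - which is exactly why such a system is useless to anyone else.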
The ubiquity of search engines was another unforeseen development in the modern way of doing things.
Ultimately, all of this came down to either a sort of "companion spirit" benevolent AI or some form of dumb automation which provided information and calculations on demand - automation which had to be skillfully wrangled and required specialized training.
The late 20th century and the beginning of the 21st have, however, heralded the arrival of what I feel is the most important development in human thought: the invention of the concept of an "interface."
The advent of the interface came rather as a surprise to everybody. The first computers were pre-set to run a task, then printed their results on ticker tape or as inscrutable output from various registers. Once it became clear that working with computing machines was worthwhile at all, the inefficiency of that arrangement became immediately obvious. Before long, we had keyboards and monitors. And then we kept using them for thirty years. We'll probably keep using them for another thirty. The arrival of the mouse (and pointing devices in general), however, ushered in the age of the graphical user interface. The first ones were not particularly popular.
An interface is, in its most general definition, an area where two systems collide and interact. In chemistry, the interface is the fuzzy area between two states or phases of the same material. In business, an interface is a system which connects two related but distinct portions of a business - marketing is the interface between production and the consumer. In computers, the interface is the facilitator (or barrier, as the case may be) between the user and the deep internals of the machine. User interfaces represent a bogus reality, created by consensus between the user and the machine, which facilitates communication between the two.
The user interface is such an important technological and scientific leap not because of its nature; the concept of a false or imaginary reality created by mutual consent for the purpose of facilitating communication or interaction between two intelligent entities is as old as the concept of the game, or the parable, or the metaphor. The significance of the user interface is more subtle - it creates a false reality which can be agreed upon between an intelligent agent and a huge library of dumb automation.
It is nothing more than a logical tool which facilitates the use of other tools. Without the simple interfaces we have created, whether graphical or mechanical, the workings of most of the modern mechanical and electronic systems we have invented would be beyond all but the most technical individuals. The interface allows nearly any human capable of logical thought and basic learning to interact with technology of arbitrary complexity - no matter how complicated the task the computer is performing, the interface can simplify it down to the most elemental levels. The user need only provide direction (hey, do this) and starting conditions (the basic data needed to start the automation) and the rest is performed by the computer - exactly as if it were another person or intellect performing the task.
Interface is so important because it allows dumb technology to take the place of an intelligent agent without requiring that the technology understand the task. It allows humans to delegate complicated jobs to entities incapable of thought.
Pretty heavy, no?