Ubiquitous: existing or being everywhere at the same time
The concept of ubiquitous computing was created by Mark Weiser, a computer scientist who worked at PARC, Xerox's technology research center. Perhaps that lab's most famous invention was the GUI (Graphical User Interface), which was later adopted and popularized by Apple.
Ubiquitous computing, or ubiquitous technology, starts from this premise:
- the main purpose of a technology is to solve a problem; if it is effective and efficient, it will eventually stop demanding your attention.
When the solution is easy to understand and easy to access, the user ends up using the technology unconsciously, making it part of their daily life.
Today, humans are surrounded by ubiquitous technology that they benefit from daily. The problem is its absence in current computing environments.
A great example of an everyday ubiquitous technology is the electrical socket.
If a device needs electrical current, the user just plugs it into the socket. They don't have to worry about anything else, least of all about how the socket works (which involves a great deal of electrical engineering).

On a computer, the current scenario is quite different. To access what they are looking for, the user usually has to carry out a series of different steps, which makes reaching the solution difficult.
We can then say that the concept of ubiquity is directly related to the efficiency of the user-problem-solution process.
Technology should not flood us with ever more information over time; that is counterproductive.
Human memory, unlike computer memory, cannot be expanded at will and does not retain all the information it acquires. The human brain works to be as efficient as possible. One sign of this is how easily it memorizes patterns and routines. When these patterns or routines are repeated so often that they are performed unconsciously, there is ubiquity in action.
Writing, which is itself an information technology, is proof of this. While learning to write, the brain devotes considerable resources to the task. Once writing has been learned and is used regularly, the brain no longer needs the same resources each time it is used. From that point on, writing becomes a ubiquitous technology, freeing up cognitive resources so that more information can be acquired through it. If humans had to spend the same mental effort every time they wrote as they did while learning, writing would be inefficient and counterproductive.
Computers
Ubiquitous computers are governed by the idea of blending into the environment, to the point that when they are used they become practically indistinguishable from it. They end up being part of the environment.
“Multimedia computers” behave in a non-ubiquitous way: their screens constantly demand the user's focus and attention and never become part of the environment. Today's personal computers can be considered “multimedia computers.”
A ubiquitous computer is therefore different from a “multimedia computer,” although both can use multimedia resources. A current example of a ubiquitous computer that makes use of multimedia resources is a voice recognition and communication device such as Amazon Alexa.
“Furthermore, although ubiquitous computers may employ sound and video in addition to text and graphics, that does not make them “multimedia computers”. Today’s multimedia machines make the computer screen into a demanding focus of attention rather than allowing it to fade into the background.”
Mark Weiser
Ubiquitous systems and security
A ubiquitous system, if implemented well, can provide better protection than current systems. If the model of “digital pseudonyms” were adopted, the need to rely on private data would be reduced and could even be eliminated. Currently, big technology companies think otherwise: the more data they collect about the user, the more control they have. This happens because the market is set up so that the primary goal is not to help the user but to take advantage of them. The problem is that this undermines the user's privacy and increases distrust in technological systems.
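To make the idea of digital pseudonyms a little more concrete, here is a minimal sketch in Python. It assumes a hypothetical derive_pseudonym helper and a secret kept only on the user's device; none of this comes from Weiser's paper, it is just one way to illustrate how a service could recognize a returning user without ever learning their real identity.

```python
# A minimal sketch of the "digital pseudonym" idea: the user derives a
# different, stable identifier for each service from a secret only they hold.
# The services see only the derived pseudonyms, never the person's identity,
# and cannot link the pseudonyms to each other.
import hmac
import hashlib

def derive_pseudonym(user_secret: bytes, service_name: str) -> str:
    """Derive a per-service pseudonym; the service only ever sees the result."""
    digest = hmac.new(user_secret, service_name.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()

# Hypothetical secret that never leaves the user's own device.
user_secret = b"kept-only-on-the-users-device"

# The same person looks like two unrelated users to two different services.
print(derive_pseudonym(user_secret, "example-shop.com"))
print(derive_pseudonym(user_secret, "example-forum.org"))
```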
Software systems should adopt security solutions modeled on the real world, which are limited and convenient. If the data a system really needs were governed by the same principle, many ethical debates would simply not exist. The data stored should therefore be limited to what is convenient and necessary. This is unlike modern architectures, which automatically assume that any system must store user information such as email, name, and much more. Not all systems need that kind of structure for the user to enjoy them.
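As a rough illustration of data that is “limited and convenient,” the sketch below contrasts the kind of account record many architectures collect by default with a minimal one keyed only by a pseudonym. The class and field names are hypothetical, chosen purely for the example.

```python
# A sketch of data minimisation under the assumptions above: the service keys
# its records on a pseudonym (derived on the user's device, as in the previous
# sketch) and stores only what the feature actually needs.
from dataclasses import dataclass, field

@dataclass
class TypicalAccount:
    # What many systems collect by default, whether the feature needs it or not.
    email: str
    full_name: str
    birth_date: str
    phone: str

@dataclass
class MinimalAccount:
    # What a simple preferences feature actually needs.
    pseudonym: str
    preferences: dict = field(default_factory=dict)

account = MinimalAccount(pseudonym="3f2a0c91", preferences={"theme": "dark"})
print(account)
```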
Sources:
[1] - The Computer for the 21st Century by Mark Weiser