Posts

My DIY home automation in a rental apartment

I emphasise the fact that we rent and do not own - this is important because home automation has always been perceived as the domain of home owners. I’m here to show that this doesn’t have to be the case, and that through the sheer magic of software you can do a lot of home automation without needing an electrician. However, I should note that you have to be handy with tools and, in my example, you should understand your mains electricity. This is neither a guide nor a how-to manual; it simply documents what I have done over the last two years (coinciding with the appearance of a certain virus) to make my home smarter. Before I start this excursion, I’d like to highlight some of my goals. My number one goal was to save electricity - or rather, not to waste electricity when the lights, the TV, or anything else with a power switch does not need to be on. My second goal was to do it in such a way...

Is Web3 the new world order, or just utopia?

As with all technological change, every industry must understand, assess, and prepare to incorporate the “new” with the existing. The payment industry, which is undergoing almost constant change, is no exception to this rule. But to understand this change, we must start at the beginning - with Web 1.0 or the “static web”, so called because you would load all the data at once from a web server. This began to change around the turn of the millennium with Web 2.0. In 1999, Microsoft experimented with what would later be called “AJAX” (Asynchronous JavaScript And XML), an enhancement to its web browser that let it asynchronously pull data from the web server without refreshing the whole page. This enabled websites to become more interactive and to behave more and more like applications. It was this underlying technology, together with JavaScript frameworks, that enormously simplified content creation and self-publishing, and ultimately gave rise to social medi...
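The asynchronous pattern described above can be sketched in a few lines. This is a minimal illustration, not the original browser mechanism: the `fakeServer` function below is a hypothetical stand-in for an HTTP endpoint, and the `page` object stands in for the DOM fragment a real AJAX call would update.

```javascript
// Simulated server endpoint: resolves with data after a short delay,
// standing in for an asynchronous HTTP round trip.
function fakeServer(url) {
  return new Promise((resolve) =>
    setTimeout(() => resolve({ url, items: ["news", "scores"] }), 10)
  );
}

// The rest of the "page" keeps running while the request is in flight;
// only this callback touches the fragment - no full page reload.
async function updateFragment(page, url) {
  const data = await fakeServer(url);    // asynchronous fetch of data
  page.fragment = data.items.join(", "); // partial update of the page
  return page;
}

updateFragment({ fragment: "" }, "/api/feed").then((page) =>
  console.log(page.fragment) // prints "news, scores"
);
```

In a browser of the Web 2.0 era the round trip would have gone through `XMLHttpRequest` rather than a Promise, but the shape is the same: request in the background, update a fragment when the data arrives.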

Is the Metaverse a future we want or a construct of commercial interest?

Quo Vadis, Metaverse? If you read popular science fiction, then you probably already have an idea of what the Metaverse is, or at least how it is portrayed: we would be immersed in a virtual-reality world, meeting, playing and working together in a fantastic setting while remaining physically separated. Compare the visions of Meta’s Mark Zuckerberg (who thinks we could all have meetings virtually, or at least augmented) and Epic’s Tim Sweeney (who believes the Metaverse should be a world for games and interaction), and you will see that these visions differ. My ideal Metaverse is a global, immersive, digitally created computer environment that is accessible via the Internet, is open to everyone (and is governed by both commercial and non-commercial interests), may require devices that can translate between the virtual world and human senses, and allows for interoperability with other generated digital worlds. But regar...

My adventures in computer restoration

TL;DR I've recorded the outcome: I am writing these lines on a computer system that has been in storage for almost 30 years. To be concrete, I am writing this on a NeXT cube with WordPerfect. Welcome to the world of vintage computing! Goethe, the giant of German literature, concluded his experiences by bringing them to paper. I have taken that wisdom to heart on previous occasions, and thus see this as essential to conclude not just a fun project but also something that has been nagging me for over 20 years. I started to collect computers around 1988 - first as a way to make some money (buying and selling), but it became clear to me that personal computer technology would represent an era in human development akin to the most important achievements in human history, so I started collecting computers that I thought would be a relevant representation of this era. Some of these computers I have used personally and in my work, and they are thus “sacred” to me. My...

Information technology is no more

As many have during the summer, I took a break and did some reading. To my amazement, I kept seeing the term “IT”, or information technology, in much of the literature I read, and I started to wonder if we’re doing ourselves any favors by still insisting on using it. I tried to answer that question for myself, and have come to the following realization: The term information technology has its origin in a world where computers were largely absent - it was first mentioned in a 1958 Harvard Business Review article. The article singled out three distinct elements: the “technique of processing large amounts of information rapidly”, the application of “statistical and mathematical methods to decision-making problems” and, finally, “the simulation of higher-order thinking through computer programs”. The article further explained that information technology would have its greatest impact on middle and top management. To be very clear, the article was not wrong, and in fac...