
Posts

Showing posts with the label vs

I've Been Vibe Coding for 2 Months, Here's What I Believe Will Happen

In the past few months, I've embarked on an experiment that has fundamentally changed how I approach software development. I've been "vibe coding" - essentially directing AI to build software for me without writing a single line of code myself. This journey has been eye-opening, and I'd like to share what I've learned and where I think this is all heading.

My Vibe Coding Journey

I started vibe coding with Claude and Anthropic's Sonnet 3.5 model, later upgrading to Sonnet 3.7, Claude Code, and other tools. My goal was straightforward but comprehensive: create a CRM system with all the features I needed:
- Contact management (CRUD operations, relationships, email integration, notes)
- Calendar management (scheduling meetings, avoiding conflicts)
- Group management for organizing contacts
- A campaign system with templates
- A standalone application using the CRM's APIs for external contacts to book meetings directly
The technical evolution of this pro...

When Bitcoin Hits $100K and AI Goes Mainstream: Whose Revolution Are We Really Watching?

Over the last decade, two seismic waves have taken shape in the digital realm. On one hand, we have the cryptocurrency movement, championed by Bitcoin and its ideological siblings, seeking to redefine money and power structures. On the other, artificial intelligence (AI)—and especially its recent breakout star, generative AI (GenAI)—is slipping into our everyday lives, reshaping how we work, create, and interact. As Bitcoin edges toward a mythical $100,000 valuation, and AI quietly proliferates, a fascinating contrast emerges: Are we witnessing the technological upheaval crypto once promised, or are we seeing existing institutions reinforced by the subtle genius of AI?

Bitcoin’s Institutional Crown—and Limited Utility

For years, Bitcoin has been heralded as digital gold, a hedge against uncertainty and the ultimate “people’s money.” If it hits $100K, it will symbolically confirm its acceptance by the very institutions it once aimed to circumvent. Big players—banks, asset managers, and ...

Are Small Language Models the future of GenAI?

I did a little experiment. I asked ChatGPT 4o (in the cloud) and the Llama 3.1 70b model (locally via Ollama) the same question: "What verse changes did Schiller make for Beethoven's 9th symphony?". Both models got the right answer; in fairness, GPT-4o did it much faster and had a much more elaborate answer than Llama 3.1 70b, which took a good 2 minutes (running on my desktop PC!). But in the end, I liked the answer from Llama 3.1 better because it was concise, to the point, minimalistic (as you would expect from a 70b model) and, in a way, more helpful. If you want to follow the experiment in real time, I've recorded it for just that purpose. This is also the reason why this is a blog post and not just an update: I can upload more than one video file.
ChatGPT 4o:
Llama 3.1 (70b) via Ollama (jump to 50 seconds, it takes a while to load the model):
What do you think?
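If you want to reproduce the local half of this experiment yourself, here is a minimal sketch of querying a model served by Ollama over its local REST API. It assumes Ollama is running on its default port (11434) and that the llama3.1:70b tag has already been pulled; the function name is illustrative, not from the recording.

```typescript
// Minimal sketch: ask a locally served Llama 3.1 model a question via Ollama's
// REST API. Assumes Ollama is running on localhost:11434 and that
// `ollama pull llama3.1:70b` has already been run.
async function askLocalModel(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3.1:70b", // swap for a smaller tag, e.g. "llama3.1:8b"
      prompt,
      stream: false,         // return one JSON object instead of a token stream
    }),
  });
  const data = await res.json();
  return data.response;      // the generated answer text
}

askLocalModel(
  "What verse changes did Schiller make for Beethoven's 9th symphony?"
).then(console.log);
```

The cloud side of the comparison is simply the same question typed into ChatGPT.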

Is the payment industry finally being shaken up by … Apple?

My starting point for this is the evolution of the Android SmartPOS and in particular its lowest common denominator, also known as SoftPOS, which allows most Android phones to become NFC payment terminals. This is the domain of the Android operating system because Google has allowed unrestricted access to NFC since the days of Google Pay (2015ish), so it is no surprise that when we talk about the SmartPOS, we really talk about the Android operating system with custom hardware and software. Now, we have known for some time that Apple will be entering this space (and in fact already has in some markets) with the iPhone, given the acquisition of Mobeewave. What is different from the Android ecosystem, however, is that Apple is again controlling every aspect of the user experience. Let me try to reach a bit further, and just to be clear: I’m not an expert when it comes to terminal hardware and PCI security. Further, I have not signed an NDA with Apple, so a lot of this i...

The new shiny armor of AI

If we listen to the media, business leaders, and the press, we should be getting behind the AI wagon because of its potential to automate many of the processes everyday companies struggle with. I don’t dismiss this notion entirely, because I think it’s true if you have the ability to integrate this technology in a meaningful way. For example, the startup company "scrambl" (full disclosure: I’m a minority investor) is making use of gen-AI by "understanding" applicants' CVs (curricula vitae) and identifying their skills to match them to open positions. This works great – I have seen it in action, and while there are some misses, most of that "normalization of skills" works. There are other promising examples, such as Q&A systems that help people understand the documentation of a complex environment. When combined with RAG (retrieval-augmented generation), this has the potential to significantly reduce the time it takes to make complexities understandable. But ...
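To make the RAG idea a little more concrete, here is a minimal sketch of the pattern: retrieve the most relevant documentation snippets for a question, then let a model answer using only that context. The naive keyword scoring stands in for a real embedding/vector search, the example documents are invented, and the local Ollama endpoint and model tag are my own assumptions, not anything scrambl uses.

```typescript
// Minimal RAG sketch: naive retrieval plus a locally served model via Ollama.
// Documents, scoring, endpoint, and model tag are all illustrative.
const docs = [
  "The settlement batch runs every night at 02:00 CET.",
  "Refunds older than 90 days require manual approval.",
  "Terminal keys are rotated via the remote key injection service.",
];

// Score a document by how many of the question's words it contains
// (a stand-in for embedding similarity).
function score(doc: string, words: string[]): number {
  const text = doc.toLowerCase();
  return words.filter((w) => w.length > 2 && text.includes(w)).length;
}

// Return the top-k documents for the question.
function retrieve(question: string, k = 2): string[] {
  const words = question.toLowerCase().split(/\W+/);
  return [...docs].sort((a, b) => score(b, words) - score(a, words)).slice(0, k);
}

// Ask a local model to answer using only the retrieved context.
async function answer(question: string): Promise<string> {
  const context = retrieve(question).join("\n");
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3.1:8b",
      prompt: `Answer using only this context:\n${context}\n\nQuestion: ${question}`,
      stream: false,
    }),
  });
  return (await res.json()).response;
}

answer("When does the settlement batch run?").then(console.log);
```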

The case for central bank digital currency as public infrastructure to enable digital assets

I have dabbled a fair amount with all sorts of cryptocurrencies and their respective permissionless networks. In fact, I have been dabbling since 2012, which is, by my last count, a whopping 12 years. While I have always maintained that the general concept of digitization and programmability of assets is on the right path, its implementation, the user experience, the accessibility, the fraudulent activities, and the overall inefficiencies of permissionless DLTs never made me into a true believer. I have stated that opinion on several occasions, here, here and here. There are still barriers to entry when it comes to the digitization of assets: sustainable and interoperable infrastructure. To illustrate this, I recently asked a notary public here in Zurich why they can’t store notarized documents as PDFs. The answer surprised me: because they must keep records for at least 70 years. Now, think about what would have happened if we stored these documents on floppy disks...

Apple's Vision Pro headset strategy is all about its Arm chips.

Apple has given us a vision of what their VR and AR future might entail. But as others have pointed out numerous times, the whole point of the showcase at WWDC 23 was to let people experiment; I’ve heard others say that it’s like the launch of the Apple Watch, when Apple didn’t really know what would become of it. This is similar and yet different. Just like with the Apple Watch (and the iPad before it), Apple sought to port its whole ecosystem onto a watch – granted, the Apple Watch can’t live on its own, and a better comparison would probably be the iPad. The iPad can live without any other Apple device and, unlike the iPhone, never really had a clearly defined function other than to watch movies and browse the web. It was not until it gained the ability to be used with a pencil that artists and designers started to explore the potential. I’m trying to point out that Apple took 5 years from the first iPad in 2010 to the iPad Pro with pencil in 2015 to find its “kille...

Is crypto the new dot-com?

Is Web3 the new world order, or just utopia?

As it is with all technological change, every industry must understand, assess, and prepare to incorporate the “new” with the existing. The payment industry, which is undergoing almost constant change, is no exception to this rule. But to understand this change, we must start at the beginning – with Web 1.0 or the “static web”, so called because you would load all the data at once from a web server. This began to change around the turn of the millennium with Web 2.0. In 1999, Microsoft experimented with what would later be called “AJAX” (Asynchronous JavaScript And XML), an enhancement to its web browser so that it could asynchronously pull data from the web server without refreshing the whole page. This enabled websites to become more interactive and behave more and more like applications. It was this underlying technology, together with JavaScript frameworks, that enormously simplified content creation and self-publishing, and ultimately gave rise to social medi...
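A minimal sketch of the AJAX idea described above: pull data from the server asynchronously and update part of the page without a full reload. The original 1999 mechanism was the XMLHttpRequest object; the modern fetch API shown here is its idiomatic successor, and the endpoint and element id are purely illustrative.

```typescript
// Sketch of the AJAX pattern: fetch data asynchronously and re-render only a
// fragment of the page, never reloading the whole document.
async function refreshPosts(): Promise<void> {
  const res = await fetch("/api/latest-posts");         // asynchronous request to the server
  const posts: { title: string }[] = await res.json();  // parse the JSON payload
  const list = document.getElementById("posts");
  if (list) {
    // Only this list is updated; the rest of the page stays as it is.
    list.innerHTML = posts.map((p) => `<li>${p.title}</li>`).join("");
  }
}

refreshPosts();
```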

Is the Metaverse a future we want or a construct of commercial interest?

Quo Vadis, Metaverse? If you read popular science fiction, then you probably already have an idea of what the Metaverse is, or at least how it is portrayed. Basically, we would be immersed in a virtual reality world with the idea of meeting, playing and working together in a fantastic setting while remaining physically separated. If you look at the difference in vision between Meta’s Mark Zuckerberg (who thinks we could all have meetings virtually, or at least augmented) and Epic’s Tim Sweeney (who believes the Metaverse should be a world for games and interaction), then you will see these visions are somewhat different. My ideal Metaverse is a global, immersive, digitally created computer environment that is accessible via the Internet, is open to everyone (and is governed by both commercial and non-commercial interests), may require devices that can translate between the virtual world and human senses, and allows for interoperability with other generated digital worlds. But regar...

Information technology is no more

As many have during the summer, I took a break and did some reading. To my amazement, I came across the term “IT”, or information technology, in much of the literature I read over the summer. And I started to wonder if we’re doing ourselves any favors by still insisting on using this term. I tried to answer that question for myself, and have come to the following realization: The term information technology has its origin in a world where computers were largely absent – it was first mentioned in a 1958 Harvard Business Review article. The article singled out three distinct elements: the “technique of processing large amounts of information rapidly”, “statistical and mathematical methods to decision-making problems” and, finally, “the simulation of higher-order thinking through computer programs”. The article further explained that information technology would have its greatest impact on middle and top management. To be very clear, the article was not wrong and in fac...

Apple is Arming itself

I’m still following Apple and developments in hardware and software in general, as it is still somewhat of a hobby of mine (having started my professional career developing on NeXT). Even after giving Apple bad grades here and here, I think yesterday’s announcement that Apple is moving to their own Arm chips is something that had long been coming, and I think it will potentially alter the computing landscape. While I’m certainly no longer able to have inside knowledge or even to follow the latest developments in chip design, I do believe in phases or cycles of evolution and revolution in technology, and I think that’s as true for information technology as it is for cars or anything else for that matter. Apple has just started to ramp up on a revolution cycle. Reading the opinions of analysts regarding Apple’s risky move from Intel to their own chips left me bewildered. After all, this is Apple’s third processor architecture change – from the Motorola 68K to the PowerPC a...

Make no mistake, we’re not going back to “normal”

Yes, there’s so much speculation as to what our medium- and long-term future might look like in a post-COVID-19 world. If you read the newspapers and magazines (here, here, and here) you will have noticed that most reports talk about a delay of 18 months until we have a working vaccine. You can probably add another six months to that number until production and distribution have scaled up; and then, of course, there are still question marks around herd immunity and how long one stays immune after having built up antibodies. In summary: we’ll be wearing masks for at least another two years. But you might also have read about the accelerating digital transformation of our lives, and that, within the span of a month, we have significantly advanced topics that have lingered for 20 years. For example, online grocery shopping and remote health care. Given that we now have a good two years to ease into a new reality, I have tried to think about what all this means for us as a specie...