Developers usually like to stay up to date with cutting-edge technologies and trends and use them in their work. For people thinking about the product, some buzzwords might suggest the “next big thing” or something desirable to have in the company. It is important to carefully consider your pick. Read on to learn why it’s important to pay attention to tech trends, how to evaluate them and what to look out for.
Bleeding edge vs stable vs old
First, let’s address the use of libraries, software and services that are not production ready – for example, those in an alpha or beta stage. These technologies might have already gained traction online, and some learning materials may exist for them, but until they are declared production ready, their producers might still change the APIs and the overall design. Even a beta, which is nearly ready, can see significant changes that force you to rewrite parts of your code because the technology changed drastically at its core. It’s not wrong to consider using such a library, but in doing so you must take responsibility for where it may take you. Just remember – going to the opposite extreme can also be problematic, because if you pick a technology on the brink of its end of life, you might incur the cost of replacing it later, or even end up becoming its maintainer.
While it’s usually the developers who pick the technologies to be used, their choice may have an impact on others as well. If a chosen technology slows down maintenance and feature delivery, the project as a whole might suffer, as additional changes might be needed because of that decision. Though rare, technology stack decisions are sometimes made on the management side rather than by developers. Why? One example could be when someone feels the need to advertise the use of a specific technology from a marketing standpoint, whether to demonstrate how up to date the company is, or maybe from a recruitment standpoint, to show candidates that they will have a chance to escape legacy technologies, dated approaches and old ways of thinking. Is that a good way to make a decision? From a developer’s standpoint, most likely not, because technical choices should be driven by the problems that need solving.
Industry standard vs niche
Let’s talk about the problem of choosing between more common technologies and something more niche. In the current Java ecosystem, the most widely adopted framework is Spring/Spring Boot. It does some things very well, but it’s been around for two decades now, which has opened the door for competitors to do certain things differently. This could be an attempt to fix a few things that Spring might have gotten wrong, or simply a way to adjust to an evolving technological landscape that did not exist at the time of Spring’s inception.
Based on this example, here are a few factors that should be considered when going with a widely adopted technology.
The advantages of integrating talked-about technology
- You probably have developers in your company who are proficient in using it
- There’s access to a lot of skilled engineers on the market
- It easily integrates with a lot of technologies that are readily available
- A lot of functions and extensions can be added
- A large community means getting help when problems arise is relatively easy
- Many applications you use might already be running on it
- As the technology has been around for years, most of the problems have probably been ironed out
- You minimize the risk of ending up with a random technology stack without the necessary competencies on your team to handle it
Questions to answer when integrating talked-about technology
- Are you only good at doing one thing? If you get stuck with a single technology, will you have a problem adapting to new requirements, whether business or technical ones?
- Are you just looking for people who memorized the documentation of a single framework? What happens when they have to move into uncharted territory? Will they cope?
- It integrates easily, but do you actually understand how this integration works? Can you solve issues efficiently if you have limited knowledge of how things work underneath?
- Given it has a lot of functions and extensions… and bloat and things that are really complex underneath, won’t it be difficult to fully understand, and time-consuming to deal with when problems arise?
- Will you be finding answers to problems you wouldn’t have had if you had gone with other technologies? Essentially, will introducing new technology create new problems?
- By solving one problem with perceived complexity, are you introducing vendor lock-in? (Perceived, because the complexity will still be there, probably even greater, just obscured)
This is only an example of the thought process you might go through. It can be applied to different areas, but I’ve picked examples I personally had to think seriously about at some point.
AI and blockchain – two of the most talked-about developments
Any day, a new advancement could turn our thinking about how we merge business with technology on its head. Sometimes “the next big thing” gets adopted not because we need it, but because it’s new, it’s marketable, because we want to be first. Recently there have been two big examples – blockchain and AI.
I find the integration of AI slightly less controversial, because its usefulness is a no-brainer – it raises productivity across different disciplines. Going in headfirst has caused some setbacks, however. Because of the way AI chat works – whatever you type may be sent to, and retained by, an external provider – there were some newsworthy leaks, which resulted in some companies banning the use of ChatGPT or replacing it with their own solutions. There are also legal problems – artists have protested its usage, since a chatbot creates new work based on other copyrighted material. Similar problems have occurred in the world of software development – GitHub Copilot faced a backlash over copyleft code being used to boost proprietary projects. Now imagine the problems if you had jumped in early, shifted your pipelines or business, and ended up in a legal gray zone. As these examples show, AI implementation should be handled by experienced engineers.
As for blockchain – if I were trying to prepare an ad for it, I would probably sell the vision of anonymous bank transfers, or of deeds for your apartment and other documents being handled by a democratized network where it’s impossible for anything to get lost or forged, or just focus on it being faster, more transparent, cheaper and more secure. Early adoption didn’t look like that, though. Let’s start with the issues of first-generation blockchains caused by their technological underpinnings – a lot of implementations were based on proof of work, which required huge amounts of energy to essentially calculate the chain itself. Also, because of proof of work, when the supply chain was disrupted during the pandemic and chips were in high demand, cryptocurrency mining caused additional strain and soaring hardware prices.
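To make the energy argument concrete, here is a minimal proof-of-work sketch in Java. It is not any real chain’s consensus algorithm – the block contents, the SHA-256-over-a-string construction and the difficulty of five leading zeros are all illustrative assumptions – but it shows why mining is brute force: the only way to find a valid nonce is to keep hashing until one happens to fit.

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.util.HexFormat;

// Illustrative proof-of-work loop: find a nonce so that the SHA-256 hash of
// (previousHash + data + nonce) starts with a given number of zero hex digits.
// Real chains use far higher difficulties, which is where the energy cost comes from.
public class ProofOfWorkSketch {

    static String sha256(String input) throws NoSuchAlgorithmException {
        MessageDigest digest = MessageDigest.getInstance("SHA-256");
        return HexFormat.of().formatHex(digest.digest(input.getBytes(StandardCharsets.UTF_8)));
    }

    public static void main(String[] args) throws NoSuchAlgorithmException {
        String previousHash = "00000f2a"; // hash of the previous block links the chain (made-up value)
        String data = "Alice pays Bob 5 units"; // made-up block contents
        int difficulty = 5;                     // required leading zero hex digits (illustrative)
        String target = "0".repeat(difficulty);

        long nonce = 0;
        String blockHash;
        do {
            nonce++;
            blockHash = sha256(previousHash + data + nonce);
        } while (!blockHash.startsWith(target));

        System.out.printf("Found nonce %d -> %s%n", nonce, blockHash);
        // Tampering with 'data' would invalidate this hash and, through the
        // previousHash link, every block after it - a forger would have to
        // redo all of this work faster than the rest of the network.
    }
}
```

On average, this toy difficulty takes around a million hash attempts to satisfy; real networks target difficulties many orders of magnitude higher and adjust them constantly, which is why mining demands specialized hardware and enormous amounts of electricity.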
What are some non-technical issues related to this tech? Well, a lot of people have gotten into cryptocurrency speculation – while some have made profits, others did not understand the associated risks and got into financial trouble. Regarding the example with documents and deeds – there is no legal framework for it yet, so it does not deliver real value as of right now. The early adoption of non-fungible tokens (NFTs) was also problematic, because it got mainstream media attention and people bought in without knowing what they were actually paying for. If you searched for information about NFTs right now, you’d find articles calling them a scam, a money laundering scheme, a financial pyramid, etc. These are early adoption problems that will be ironed out someday – through a legal system or institutions like banks backing it up, technological advancements minimizing the downsides, and a broader understanding of the concept among the public.
Adopting emerging technology should not be a knee-jerk reaction
I’ve shown some potential problems with early adoption, along with some upsides, but what is the lesson here? While it’s always a good idea to look into the future and try to get in on “the next big thing”, it has to be properly researched so that risks and gains can be assessed, especially in terms of whether it makes sense in the context of your business model. In many cases the real motivation is to be able to proudly write “we are one of the first with …” on a website or in marketing materials – but if that’s not backed up by a tangible need or an essential improvement to something, it may be a waste of time and/or money.
In today’s rapidly developing technological landscape, companies across sectors are looking for proven experience and know-how on implementing emerging technologies. To learn how our experts can provide trusted consultancy and innovative software engineering services that will help you navigate existing, developing and future trends, fill out the contact form.
About the author
Przemysław Kasprzyk
Senior Software Engineer
A Software Engineer with 7+ years’ experience in Java and Spring back-end development for financial services and HealthTech, Przemysław has also been involved in front-end, mobile and desktop app development. Apart from software engineering, he’s supported projects in various ways, such as Agile team leadership, code reviews, codebase maintenance and cybersecurity. A member of Software Mind’s Java Guild, he values constant improvements to workflows that drive high code quality, efficiency and app performance.