When was the last time inverting a binary tree helped ship a product customers love? 2010 called, and it wants its leetcode back.
Back in January I wrote about the death of software design. I recommend giving it a read first, as it discusses the impact AI coding assistants can have on the design aspects of software engineering.
Now it’s time to answer the question: what defines a great software engineer in 2025? With AI tools able to write functional code that solves complex problems, we need to rethink how we hire and develop engineering talent.
So long, technical interview
Technical interviews suck. They always have.
Companies usually fall into one of two camps:
Algorithm-heavy, leetcode style interviews
Take-home assignments
When modern LLMs can solve such contrived problems faster than most engineers, the first style of interview simply isn’t relevant any longer. What are we really testing then?
As for take-home assignments? Candidates are asked to build a complete full-stack application — authentication, database design, automated tests and all — as part of the exercise.
All of this is meant to be done for free, in the candidate’s “spare time”, while working a full-time job, juggling other interviews and, you know, having a life.
These have always been fraught with problems:
they fundamentally disrespect candidates' time and existing commitments
they create unfair barriers for parents and caregivers
they ignore the collaborative nature of software engineering
they completely miss how AI tools are changing the way we work
After two decades in the industry, the most impactful projects I’ve worked on had very little to do with algorithmic prowess and everything to do with customer value.
In 2011, way before Gen AI was viable and could help me write code, my team and I created a web application over a weekend that was able to collect ~$24 million in donations, without having to invert a single binary tree.
More recently, at Optus, my teams have consistently delivered customer value by leveraging AI in our products. And we did not have to implement QuickSort by hand.
I don’t want to downplay the importance of understanding computer science fundamentals. Big O notation, data structures and classic algorithms are critical skills: should I use breadth-first search or depth-first search? Should this data live in a list or a hash table? How should I sort data that doesn’t fit in memory? My point is this: surprisingly little of commercial software engineering ever gets to this level.
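To make the “list or hash table?” question concrete, here is a throwaway sketch (mine, not from the article) of the difference in practice — the sizes and names are arbitrary:

```python
# Sketch: membership checks on a list scan every element (O(n)),
# while a set (hash table) jumps straight to the bucket (O(1) on average).
import timeit

n = 100_000
as_list = list(range(n))
as_set = set(as_list)
target = n - 1  # worst case for the list: the very last element

list_time = timeit.timeit(lambda: target in as_list, number=100)
set_time = timeit.timeit(lambda: target in as_set, number=100)

print(f"list lookup: {list_time:.4f}s, set lookup: {set_time:.6f}s")
```

On any machine the set lookup wins by orders of magnitude — exactly the kind of fundamentals call that does surface in everyday commercial code.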
In both of the previous examples, the skills that proved valuable are:
Deep understanding of the business needs
Relentless focus on customer value
Expertise in system design and different architectural patterns
Code is out. What’s in?
AI is slowly but surely turning the act of writing code into a commodity, and the ability to write code is losing its status as a differentiator between software engineers. Besides, writing code is a surprisingly small part of what a software engineer does, as shown in the chart below.
This means every other activity and skill will have its importance amplified: while AI excels at writing the code, engineers need to master what happens around it:
Critical Thinking & First Principles: This has always been important but will become more relevant in the coming years. The ability to decompose complex problems into fundamental parts, question underlying assumptions, evaluate AI solutions through a business lens and design new ways to leverage AI capabilities.
System Design & Operational Maturity: Writing a POC is very different from running and maintaining reliable, performant systems in production. You have to answer critical questions about reliability, acceptable failure rates, service restoration times, and deployment efficiency.
Business & Product Acumen: This is about connecting the code you write and the systems you build to product outcomes. Knowing the impact on your users and business isn’t just for product managers. Software engineers need to demonstrate enough commercial maturity to be successful.
AI fluency: Understanding where AI can be most useful, and seamlessly integrating different tools into your workflow to maximise customer value. These include activities across various stages of the development lifecycle: brainstorming, documentation, code, observability, security… Don’t let AI become a hammer.
The most effective engineers will be the ones who deeply understand customer pain points and business objectives. They can translate abstract requirements into concrete technical solutions that are robust and can scale. They are able to balance technical expertise with business impact.
Reimagining the Technical Interview
The definition of insanity is doing the same thing over and over again but expecting different results.
Regardless of which camp your organisation is in (leetcode or take-home exercises), if we want to evolve and take our engineering teams to the next level, we can’t keep interviewing software engineers the way we always have. I’ve given this a fair bit of thought, and I recommend the following structure as a starting point.
This process is composed of four distinct parts, which may be covered in 3-4 interviews of 45 minutes each.
1. Problem Discovery & Exploration
Rather than jumping to solutions, we need to assess how candidates deal with ambiguity and how they are able to identify core constraints and trade-offs. How do they connect technical solutions with business goals? Can they methodically break down complex problems?
Think of this as the engineering equivalent of product discovery. They need to understand the problem space before diving into solutioning mode.
2. AI Collaboration
This is an entirely new area we need to assess, and it will evolve over time. At a minimum, candidates should be able to:
Create effective prompts for AI tools like GitHub Copilot or Cursor: these tools are only as good as your prompts. Asking good questions is still an incredibly valuable skill — except this time candidates are asking them of a coding copilot.
Apply critical judgment to AI-generated solutions: does the candidate just copy and paste a solution? What about performance and security concerns? Is the output modular and testable? As highlighted in my previous article, these tools tend to spit out pretty average code. Can candidates reason about its trade-offs?
Orchestrate AI outputs into cohesive solutions: in a way, this is about how these tools fit into the broader solution architecture.
The ultimate goal of this stage is to assess how candidates can leverage AI in their day-to-day work. Do it in person if possible. If not, ask them to share their screen and tell them it’s OK to use coding assistants. Watching how they use such tools is incredibly insightful, and trying to prevent it isn’t doing you any favours.
A good way to do this is a pair programming session on an existing codebase that includes a few bugs and scaffolding for a few new features. This offers several advantages:
It’s respectful of the candidate’s time
It presents a more realistic setting and problem space
It doesn’t artificially constrain tool usage
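To make this concrete, here is a minimal, hypothetical example of what one seeded bug in such a codebase could look like — the function name, the bug, and the anchor test are all invented for illustration:

```python
# Hypothetical seeded-bug exercise for a pairing interview.
# The candidate receives the buggy version plus a failing test, and is free
# to use any tool — including a coding assistant — to diagnose and fix it.

def apply_discount_buggy(price: float, percent: float) -> float:
    """Intended: apply a 0-100 percentage discount to a price."""
    return price - price * percent  # BUG: treats percent as a 0-1 fraction

def apply_discount_fixed(price: float, percent: float) -> float:
    """What a successful candidate might converge on."""
    return price * (1 - percent / 100)

# The failing assertion that anchors the session:
print(apply_discount_buggy(200.0, 10))  # -1800.0, clearly wrong
print(apply_discount_fixed(200.0, 10))  # 180.0
```

A bug this small is deliberate: the signal isn’t the fix itself, but how the candidate narrates their diagnosis and whether they blindly accept whatever an assistant suggests.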
This structure is similar to the technical interview I had as an Engineer over 10 years ago at both Atlassian and ThoughtWorks. There was a pair programming interview where the interviewer fired up IntelliJ with an existing Java project that had a few bugs in it. During the test, I was able to use any tools I needed to solve the bugs and implement new features, always explaining the reasoning behind my next step to the interviewer. It was a pleasant experience.
I see this stage in a similar way — using IntelliJ as a Java engineer was non-negotiable then, and I expect using coding copilots will follow suit in due time.
3. System Design & Operational Maturity
As writing code becomes easier, system design becomes even more important. Many companies already run a system design interview and have reported success with it. Alex Xu’s System Design Interview books (vols. 1 & 2) are great resources to help engineers prepare for this stage.
The ability to design resilient, scalable systems, make informed technical trade-offs, consider operational implications and communicate complex concepts clearly remains crucial and, at least for now, something reserved for humans.
This stage is here to show that foundations still matter. To be successful, you still need to understand data structures, the trade-offs between relational, document and graph databases, observability and distributed tracing — but all of it in the context of solving a real problem, not a contrived puzzle.
4. Delivering Value
A classic for a reason, this needs additional emphasis in this new version of the engineering interview. Successful candidates need to demonstrate how they:
Align technical decisions with business outcomes
Navigate competing priorities effectively
Handle ambiguous requirements with confidence
Push back constructively when needed
At the end of the day, the best architected system with no users is worthless.
We need to care about the technical details but we have to remember that the technology is there to assist in solving real problems, experienced by real people. As I said in my article about turning 40, being an idealist is fine — just don’t forget the bigger picture.
Investing in the next generation
The framework I described above works well for mid-to-senior+ engineers who already have a solid foundation. But how do we ensure new engineers develop strong fundamentals in an AI-first world?
There’s an interesting tension here: AI tools can make engineers more productive, but they can also create knowledge gaps if relied upon too early. (I suppose the same could be said of Stack Overflow, but on steroids!) It’s like learning maths with a calculator — useful once you understand the core concepts, potentially harmful if used before developing that understanding.
The good news is that we don’t need to reinvent the wheel here. The following are some approaches I’ve used in the past which can work just as well:
Structured learning paths: Create dedicated time for learning the foundations. Book clubs, user groups and workshops work wonders. I guarantee you have great teachers on your staff right now who would be only too happy to share their knowledge — be it implementing basic data structures, writing SQL queries by hand, or designing simple systems from scratch.
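As one hypothetical example of such a workshop exercise — implementing a basic data structure by hand, with no AI assistance, to internalise how it actually works:

```python
# Workshop exercise (illustrative): build a LIFO stack from scratch,
# backed by a singly linked list, instead of reaching for list.append/pop.

class _Node:
    def __init__(self, value, next_node=None):
        self.value = value
        self.next = next_node

class Stack:
    """A minimal stack; push and pop are both O(1)."""

    def __init__(self):
        self._top = None
        self._size = 0

    def push(self, value):
        # The new node points at the old top, then becomes the top.
        self._top = _Node(value, self._top)
        self._size += 1

    def pop(self):
        if self._top is None:
            raise IndexError("pop from empty stack")
        value = self._top.value
        self._top = self._top.next
        self._size -= 1
        return value

    def __len__(self):
        return self._size

s = Stack()
for item in (1, 2, 3):
    s.push(item)
print(s.pop(), s.pop(), len(s))  # 3 2 1
```

The value isn’t the stack itself — it’s that a junior engineer who has built one by hand can later judge whether AI-generated code using such structures makes sense.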
Pair programming & Hackathons: Partner junior engineers with senior engineers in "AI-assisted vs manual" exercises. Have them solve the same problem both ways, then compare approaches. This builds critical judgment about when to use AI.
Progressively introduce tooling: I’m a little torn on this one and may still change my mind, but for now: don’t give Copilot access to graduate or junior engineers on day one. The first 3-6 months are the time to hone the basics. This is an investment in their education and a risk-reduction investment for your organisation. Only then should you introduce coding copilots.
The temptation is high. As a business leader you can see the productivity gains but trust me when I say that we have to invest in the next generation of software engineers. It’s the only way we will succeed in the long run.
Putting it into practice
Ok so you want to give this a go. But how do you get started? While there's no one-size-fits-all approach, here's what I think might increase your chances of success.
Don’t worry about AI taking your job. Instead, worry that someone who uses AI effectively will take your job.
If you're leading engineering teams, start by rethinking your interview process. Ditch the leetcode tests and take-home exercises. Maybe keep one lightweight coding exercise to ensure basic competency — especially for more junior roles — but shift your focus to real-world problem exploration.
Giving candidates an open-ended, messy problem lets you observe how they navigate ambiguity, which tells you far more about their potential than whether they can implement QuickSort without Stack Overflow.
Add AI tools to your technical assessments too. As I described in the previous section, let candidates use coding copilots. Watching how they interact with the tools will give you a lot of insight into how they think and what they care about.
For the engineers out there, invest in the foundations: data structures, databases, cloud infrastructure. Then, zoom out and understand system design and architecture more deeply. This is how you will set yourself apart.
But invest just as much time developing your business acumen and critical thinking. Read business books. Understand how your company makes money. Learn to articulate technical decisions in terms of business value. Use the AI tools at your disposal to develop proficiency.
Organisations as a whole need to rethink how they develop engineering talent
More importantly, don’t rush to throw out everything you’ve been doing. Take a progressive approach and start small, experiment, and adjust based on what works for you. Test & learn is just as important here as it is in product development.
The Next Step in Engineering Excellence
The software engineer of 2025 and beyond will look different. They bridge business needs, technical solutions, and AI proficiency, creating value through end-to-end product ownership in addition to writing great code.
Success, then, requires a shift in how we think about engineering talent.
This has been my best stab at what we will need going forward. I might look back in 12 months and overhaul the whole thing. That’s the beauty of learning. That’s why we experiment.
How are you adapting your hiring process given the changes in how engineers and product managers work? What skills are you prioritising in your engineering teams? I’d love to hear from you in the comments.