Tech is broken and AI won't fix it

Watch talk on YouTube

Recruitment & Staffing

The background for this talk is a very bad experience in this sector

  • “We want someone opinionated” -> No you don’t

We don’t know how to recruit engineers

  • How many rounds of interviews?
  • Technical tests?
    • Don’t always show your real-world skills
    • Many different kinds that usually don’t fit the role (e.g. telling a junior apart from a senior)
    • Other fields don’t do this
  • Is there an accurate feedback loop?
    • The answer to the question “how was the interview process?” usually depends on whether there was a result and what kind of result
  • Many just try to copy FAANG
    • Plot twist: you are not FAANG, you are a small company
    • Their process mostly tests resilience, not technical skill

We’re not very good at training for the future

  • Many companies “only recruit senior engineers”
  • Junior engineers are sometimes viewed as a cost center
  • But they are important:
    • Force seniors to explain stuff better
    • Force collaboration
    • Are a bit like the usual clueless customer

Teamwork

  • Depending on the team size and skillset the requirements for leadership change (Lead Engineer vs Engineering Manager)
  • Don’t get hung up on labels like “this is our team’s senior/junior engineer” - everyone serves a role that changes depending on context

Agile is broken

  • The core of agile is about
    • Individuals & Interactions > Processes & Tools
    • Working Software > Comprehensive Documentation
    • Customer Collaboration > Contract Negotiation
    • Responding to Change > Following a Plan
  • Agile never planned for all of the meetings
  • If you have to have standups: keep them short, with clear rules

Bias

  • Everyone has biases, both conscious and unconscious
  • Example: the doctor test - “I went to the doctor” -> most people assume a man
  • AI reproduces the same stereotype: “Draw a doctor” -> a white man with glasses
  • Reality usually contradicts these biases

Coding

  • In the old days we copied from Stack Overflow; now we let AI do the copying
  • GenAI’s LLMs base everything on past code - will we reach a point where nothing is new?

What can we do

  • Be an empathetic human (by gut feel or just by asking others)
  • Develop engineers
  • Never say something is shit
  • Treat AI as an unreliable colleague (never assume it’s always right or wrong)