In this blog series I will explore several known bad developer behaviors that lead to insecure software, and how we can combat them by applying interventions from behavioral economics. The series expands on my conference talk ‘Threat Modeling Developer Behavior: The Psychology of Bad Code’. The slides are available here. The recording of the talk is available here.

Blogs in this series:

  • Part 1 – The Psychology of Bad Code
  • Part 2 – Building Systems That Support Secure Developer Behavior
  • Part 3 – not yet written

What if insecure code is not a result of laziness, a lack of knowledge, or malice? What if software developers are doing their best, but they have been set up to fail?

How People Make Decisions

I have never understood why people make the decisions that they do, and it is something I have always been quite curious about. I have misunderstood social situations many, many times, especially when someone has attempted to show romantic interest in me (they practically need to hit me over the head to get me to notice). People have told me that I am too forward and blunt, and I have never understood gentle hints. I am a person who tries to use logic to make decisions whenever possible, and when I have seen others making decisions that I felt were illogical, I assumed they were using their emotions to make them. Trying to understand other people has been both a struggle and a fascination for me for a very long time.

Somewhat recently, I discovered behavioral economics, the study of how humans make decisions… and a whole new window into other people’s thoughts opened up for me. Although I have read books on therapy and psychology before, behavioral economics focuses on motivation in a different way, one that makes a lot of sense to my science-loving brain. As I read more books and watched some of the ideas play out in real life in front of me, I wondered if I could apply them to my work.

Tanya presenting on behavioral economics and developer behavior at a conference, standing behind a podium with a dark curtain backdrop.

First, I started thinking about nudges, because I had read the book Nudge by Richard Thaler and Cass Sunstein. Was there a way I could nudge software developers into writing more secure code? Was there a way I could work this into my training and the communities I have built, so we could all get better results? For those of you who don’t know me and my work: I have been trying very hard to change our industry for the better. I want the world to create more secure software, and whatever way I can help achieve that, I’m in.

A speaker presenting on stage, wearing a purple dress and black boots, gesturing with one hand and engaging the audience.

Back to the economics…

Here I was, working at Semgrep, and they asked me to try to get a speaking slot at Black Hat. At that point I had been rejected, I think, six or seven years in a row, and I was really not in the mood for more rejection from them… But the call for papers asks for ‘novel’ research, and I thought maybe this could be it. No one else (that I could find on the internet at the time) had tried applying behavioral economics to application security problems, so it might qualify as ‘novel’, and away I went to start my research.

Spoiler alert: Black Hat has never accepted any of my talks for their conference, and this one was no exception. But OWASP did, and the rest is history!

Since behavioral economics is about behaviors, I started looking into ‘known bad’ developer behaviors. Developers write the code, so I figured I needed to focus on them; plus, I was a developer for 17 years, which means I am one of them and helps me understand them better. I wanted to find behaviors that I knew would cause insecure code, and OMG, there are so many well-documented bad behaviors! With my application security hat on, I could see how they turn into less-secure decisions and outcomes. From there I made a list of ten behaviors that I wanted to try to fix, avoid, or improve. Here’s what I came up with (with a small code illustration of one of them right after the list):

Known Bad Developer Behaviors

  1. Vibe Coding (AI-assisted, fast, context-less)
  2. Tight Deadlines → Insecure Shortcuts
  3. Copying from Online Forums Without Context
  4. Obsession with New Tech / Shiny Frameworks
  5. Avoiding Documentation
  6. Repetitive Code / Lazy Error Handling
  7. Committing Untested Code
  8. Overengineering / Showing Off
  9. Ignoring Compiler Warnings
  10. Not Reviewing Code (or Fake Reviewing)
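
To make this concrete, here is a tiny, hypothetical illustration of how one of these behaviors (copying from online forums without context) turns into insecure code. This is only a sketch I wrote for this post, not something from my talk; it assumes a Python app using the standard library’s sqlite3 module, and the data and function names are made up.

```python
# A hypothetical before/after: the "forum copy-paste" version builds SQL with
# string concatenation, the fixed version uses a parameterized query.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, email TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'alice@example.com')")


def find_user_insecure(name: str):
    # Copied-from-a-forum style: user input is concatenated straight into SQL,
    # which makes the query injectable (e.g., name = "' OR '1'='1").
    query = "SELECT * FROM users WHERE name = '" + name + "'"
    return conn.execute(query).fetchall()


def find_user_secure(name: str):
    # Parameterized query: the driver treats the same input as data, not SQL.
    return conn.execute("SELECT * FROM users WHERE name = ?", (name,)).fetchall()


print(find_user_insecure("' OR '1'='1"))  # returns every row in the table
print(find_user_secure("' OR '1'='1"))    # returns nothing
```

The snippet from the forum “works”, which is exactly why it gets pasted in without context, and why the behavior is so common.
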
Tanya presenting at a conference, wearing a purple outfit and gesturing with hands while speaking.

From there I wanted to investigate WHY people behave these ways. I truly believe that developers, and everyone else who works on software, care about the final product. I believe they want to create high-quality apps, and that almost everyone nowadays values safety, privacy, and security in the apps they build and use. You can argue with me about it in the comments if you want, but I do not believe it for one second when people say “devs don’t care about security”.

I believe, and have believed all along, that developers have far too many demands and priorities placed on them. They are trying to appease many people, teams, and requirements, and security is just one item on that list. I think they do care about security, because it is part of quality. And if they were doing these bad behaviors because of the forces behavioral economics describes, not because “they don’t care”, then my belief would be correct. Confirmation bias, here we come!

For each one of the behaviors, I tried to match it to a cognitive bias and/or heuristic. Time for some definitions.

A cognitive bias is a systematic and recurring error in thinking, processing information, and interpreting one’s surroundings. These mental shortcuts are unconscious and lead to deviations from objective rationality in judgment and decision-making.

A heuristic is a mental shortcut: either a rule or piece of information used in (or enabling) problem-solving and decision-making, or a way of proceeding to a solution by rules that are only loosely defined.

Both terms mean that we are on autopilot when making these types of decisions, and when it comes to writing secure code, that is obviously not good.

In the coming blogs I will go over each of the 10 behaviors, the corresponding cognitive biases and/or heuristics that I think apply, and most importantly, what we can do about them!

For now, I want to give you something you can do right away. I believe we can improve any AppSec program by applying these 3 ideas:

  1. Design technical nudges (e.g., secure-by-default templates, inline tool warnings, regular educational messages/events/options); see the sketch after this list
  2. Shift incentives (e.g., recognize simplicity, reward documentation, train for better outcomes, praise good behaviors)
  3. Change the culture (because culture can scale indefinitely, where willpower cannot).
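
To make the first idea concrete, here is a minimal sketch of what a ‘secure-by-default template’ nudge could look like. It assumes a Python shop that uses the requests library, and the names (SecureSession, DEFAULT_TIMEOUT, the URL) are all hypothetical; the point is that the easiest thing to import is also the secure thing.

```python
# A minimal sketch of a "secure by default" internal helper: developers import
# the team's client instead of calling requests directly, and the safe settings
# come along for free.
import requests

DEFAULT_TIMEOUT = 10  # seconds; prevents requests that hang forever


class SecureSession(requests.Session):
    """A requests Session with safer defaults baked in."""

    def request(self, method, url, **kwargs):
        # Always enforce a timeout unless the caller explicitly sets one.
        kwargs.setdefault("timeout", DEFAULT_TIMEOUT)
        # Never allow TLS verification to be silently switched off.
        if kwargs.get("verify") is False:
            raise ValueError("TLS verification cannot be disabled in this client")
        return super().request(method, url, **kwargs)


# Usage: import this shared client instead of calling requests directly.
http = SecureSession()
# response = http.get("https://api.example.com/data")  # hypothetical URL
```

The nudge lives in the default: nobody has to remember the timeout or the TLS setting while under deadline pressure, and turning verification off requires a deliberate, visible change that will stand out in code review.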

In the next 3 posts I am going to go over the 3 ideas above, with suggestions you can apply to your program, workplace, or organization. Because maybe you don’t know which bad developer behaviors are happening, but you know for sure that your code could be better. After those 3 “these apply to everyone” posts, I will dive deep into each of the 10 bad behaviors above.

In the comments, please tell me if you have any ideas for creating nudges, improving incentives, or changing culture. I would love to hear your ideas!
