Recent high-profile software supply chain breaches have sharpened the focus on application security. As cybersecurity professionals know all too well, however, concern doesn’t always translate into action. In theory, the rise of DevSecOps practices that shift responsibility for application security further left should reduce, if not outright eliminate, the vulnerabilities that now routinely make their way into production applications. Unfortunately, DevSecOps adoption is still in its early days, so the impact of this shift remains limited at best, especially given the level of security knowledge the average developer possesses.

Cybersecurity professionals know in their bones that developers are at the root of most of the issues they face daily. It’s not that developers deliberately build and deploy vulnerable applications; they simply don’t know what to look for. By the time an application is scanned, usually just a few days before it’s due to be deployed, it’s too late to do much more than catalog the security flaws that need to be addressed. Breaking that cycle will require cybersecurity teams to engage developers meaningfully, and much earlier, in the application development life cycle.