An Interview with Parasoft's Wayne
Ariola
by Frank Sommers
May 7, 2008
Summary
Discovering security problems early in the development cycle
is only the first step toward creating more secure and reliable applications,
says Parasoft's Wayne Ariola
in an interview with Artima. For developers to work
effectively in a security-conscious environment, addressing security-related
coding issues must be integrated into their daily workflow.
Agile development practices have promoted the idea of
continuous testing and defect detection: rather than waiting for a separate QA
cycle to catch errors late in the development cycle, defects are found when
they are easier and cheaper to address. That's especially true of coding
defects that can result in security violations.
While most development tools integrate some form of testing,
such as running unit tests from within an IDE, few integrate security-related
coding defect detection into developers' daily workflow.
Parasoft's Wayne Ariola
explains in this interview with Artima that close workflow integration makes it
convenient for developers to fix security defects as soon as they are detected:
Discussion about
security in the industry has centered around a
perimeter approach. With that approach, operations folks fortify the perimeter
and build an infrastructure to prevent vulnerabilities, potential violations,
or viruses from entering your environment. However, if you really think about it,
most of those malicious items are counting on weaknesses in the underlying application.
If you build your application to be secure, that is your most solid line of
defense.
Developers
generally want to do the right thing with regard to security, but there is
always a communication gap between management and the development team.
Management may wish to focus on the functionality and making a deadline, and
may not be that cognizant of the security implications of the software that's
being developed and ultimately released. Developers, on the other hand, may
know about potential security problems, but they seldom have the opportunity to
communicate that to their managers, and especially to communicate that
regularly.
By the same token,
developers may also not have many opportunities to learn about the security
implications of their code. While many developers know about the basics of
security, it's not easy to relate those abstract concepts to the actual code
they're working on. Code reviews afford that opportunity, but they may not
happen early enough in a project for security-related problems to be
remediated.
Traditionally,
most of your developers have been trained to crank out code: write code that
meets the requirement and that performs. There has been an education gap in
terms of how to write secure code. It's a gap that, over the years, we've
trained developers not to watch out for.
What we're seeing
is that late in the development cycle, when the application is nearing
completion, folks do exercises such as code audits. A code audit does identify
potential vulnerabilities in the application. However, that information usually
arrives far too late in the software life cycle: remediating those errors is
almost impossible within the demands of time and product release.
There are two
things we need to be able to do: First, we need to push this type of analysis
back into the SDLC so we are able to detect errors when the application is
still in a relatively incomplete stage: looking for patterns in the code or
using other tools to dig into the code in order to predict if there are
potential vulnerabilities within that code structure. Second, once those
vulnerabilities are detected, we need to educate the developer, or the
organization, on the organization's security policy and raise the relative IQ
of the organization at the same time.
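To make that pattern-based detection concrete, here is a minimal Java sketch of
the kind of code such analysis typically flags: untrusted input flowing into a
sensitive sink without validation. The class and method names are hypothetical,
and the rule illustrated is a generic taint-style check, not Parasoft's actual
rule set.

import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class OrderLookup {

    // A pattern a static analyzer can flag: "tainted" data (input the
    // application does not control) flows into a sensitive sink (a SQL
    // statement) with no validation or escaping along the way.
    public ResultSet findOrders(Connection conn, String customerId)
            throws SQLException {
        Statement stmt = conn.createStatement();
        // Flagged: customerId may originate from a web request and is
        // concatenated directly into the query text.
        return stmt.executeQuery(
                "SELECT * FROM orders WHERE customer_id = '"
                + customerId + "'");
    }
}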
We can do that by
using automated workflow and prioritization to make sure that the most serious
vulnerabilities receive the highest-priority treatment,
and that the highest-category errors are the most visible. Using those
techniques, we are able to deliver to the developer's desktop a prioritized
task list that shows them the elements they need to remediate based on
security. Those are potential quality-related errors that should be
addressed immediately.
For example, if we
find an input-validation violation in the code you checked in last night, the
task list in the morning will show, inside your IDE, the need to remediate that
potential vulnerability. When you right-click on that task, you instantly get
access to the organization's best-practices knowledge base. That knowledge
base includes a description of the error, an explanation of why it is part of
the corporate policy, and an example of how to remediate the error.
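As a sketch of what such a remediation example might look like, again using the
hypothetical OrderLookup class from above, the knowledge-base entry for an
input-validation violation could pair a validation check with a parameterized
query, so the input is bound as data rather than interpreted as SQL. The
alphanumeric format rule here is an illustrative assumption.

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class OrderLookup {

    public ResultSet findOrders(Connection conn, String customerId)
            throws SQLException {
        // Validate the input against the expected format first; the
        // alphanumeric rule is an assumption made for this example.
        if (customerId == null || !customerId.matches("[A-Za-z0-9]{1,16}")) {
            throw new IllegalArgumentException("invalid customer id");
        }
        // Bind the value as a parameter so it can never change the
        // structure of the SQL statement.
        PreparedStatement stmt = conn.prepareStatement(
                "SELECT * FROM orders WHERE customer_id = ?");
        stmt.setString(1, customerId);
        return stmt.executeQuery();
    }
}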
That way, we are
not only helping remediate the error in real time, within a workflow that
makes it as seamless as answering your email when you arrive in the morning;
we're also educating the developer at the same time, so his likelihood of
committing that error again is diminished.
That's a very
helpful feature in a globally distributed and outsourced environment. You can
build a service-level agreement based on the potential injection of
vulnerabilities by third parties. Once you can measure and highlight that, you
have the capability to reject code that does not meet a specific threshold.
That automates the process, and allows us to be more productive.
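As a rough illustration of such a threshold, and not a description of
Parasoft's product, an automated acceptance gate might look like the following
Java sketch; the class name and the severity scale are assumptions made for
the example.

import java.util.List;

public class DeliveryGate {

    private final int maxHighSeverity; // threshold agreed in the SLA

    public DeliveryGate(int maxHighSeverity) {
        this.maxHighSeverity = maxHighSeverity;
    }

    // Each entry is the severity of one detected violation, on an
    // assumed 1 (low) to 5 (high) scale.
    public boolean accept(List<Integer> severities) {
        int high = 0;
        for (int severity : severities) {
            if (severity >= 4) {
                high++;
            }
        }
        // Reject the delivery if it injects more high-severity
        // findings than the service-level agreement allows.
        return high <= maxHighSeverity;
    }
}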
Our tool now
leverages task-management tools inside the IDE, such as Mylyn
in Eclipse. So the security-related tasks that our tool creates, based on a
corporate security policy, will show up as Mylyn
tasks inside Eclipse. We allow you to define workflow and task prioritization
that then ties into your organization's policies, and that integrates into your
developer tool for task management.
To what extent do you integrate security-related code checks
into your development workflow?
Post your opinion in the discussion forum.
Parasoft's Application Security
page:
http://www.parasoft.com/jsp/solutions/security_solution.jsp
Frank Sommers is Editor-in-Chief of Artima Developer. He also serves as chief editor of the IEEE Technical Committee on Scalable Computing's newsletter, and is an elected member of the Jini Community's Technical Advisory Committee. Prior to joining Artima, Frank wrote the Jiniology and Web services columns for JavaWorld.