Over the past decade, the automated generation of test inputs has made significant advances. Modern fuzzers and test generators readily produce inputs in complex formats and systematically cover the input and execution space. Testing protocols, however, has remained a frontier for automated testing, as a test generator must interact with the program under test, producing messages that conform to the current state of the system.
In this paper, we introduce language-based protocol testing, the first approach to specify, automatically test, and systematically cover the full state and input space of protocol implementations. We specify protocols as interaction grammars, an extension of context-free grammars that tags each message element with the communication party responsible for producing it. Interaction grammars embed classical state models by unifying states, messages, and transitions into nonterminals, and can be used both for producing interactions and for parsing them, making them ideally suited for protocol testing.
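To make the idea concrete, here is a minimal sketch of an interaction grammar as a plain context-free grammar whose terminals are tagged with the producing party. All names (the parties, the rules, the toy SMTP-like messages) are illustrative assumptions, not FANDANGO's actual specification syntax:

```python
import random

# Hypothetical toy "interaction grammar": each nonterminal maps to a list of
# alternative expansions; terminal elements are (party, message) pairs tagging
# which side of the conversation must produce the message.
GRAMMAR = {
    "<session>": [["<hello>", "<query>", "<bye>"]],
    "<hello>":   [[("Client", "HELO"), ("Server", "250 OK")]],
    "<query>":   [[("Client", "GET /"), ("Server", "200 OK")],
                  [("Client", "GET /missing"), ("Server", "404")]],
    "<bye>":     [[("Client", "QUIT"), ("Server", "221 BYE")]],
}

def produce(symbol, rng=random):
    """Expand a nonterminal into a flat interaction trace of (party, message) pairs."""
    trace = []
    for element in rng.choice(GRAMMAR[symbol]):
        if isinstance(element, tuple):      # tagged terminal: emit directly
            trace.append(element)
        else:                               # nonterminal: recurse
            trace.extend(produce(element, rng))
    return trace

trace = produce("<session>")
```

Because states, messages, and transitions are all nonterminals, the same grammar that generates `trace` here could, read in reverse, serve as a parser checking that an observed conversation is a valid derivation.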
By systematically covering the interaction grammar and solving the associated constraints, FANDANGO achieves comprehensive coverage of the protocol interactions, resulting in high code coverage and a thorough assessment of the program under test.
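The notion of "systematically covering" a grammar can be sketched as coverage-guided expansion: keep deriving interactions, each time preferring a grammar alternative that has not been exercised yet, until every alternative is covered. The grammar and rule names below are hypothetical, and this sketch omits the constraint solving that FANDANGO performs on top:

```python
# Hypothetical two-alternative rule: a well-formed query and an erroneous one.
GRAMMAR = {
    "<query>": [["GET /", "200 OK"], ["GET /missing", "404"]],
}

def cover_all(symbol):
    """Derive traces until every alternative of `symbol` has been used once."""
    alternatives = GRAMMAR[symbol]
    covered, traces = set(), []
    while len(covered) < len(alternatives):
        # Systematically pick the first alternative not yet covered,
        # rather than sampling at random as a plain fuzzer would.
        idx = next(i for i in range(len(alternatives)) if i not in covered)
        covered.add(idx)
        traces.append(alternatives[idx])
    return traces

traces = cover_all("<query>")
```

Driving generation from grammar coverage rather than random choice is what turns both the success and the error responses into guaranteed test cases, which in turn exercises the corresponding code paths in the implementation under test.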
With more and more AI-generated code, comprehensive system testing becomes more important than ever. Our new paper "Language-Based Protocol Testing" (with Alexander Liggesmeyer and Pepe Zamudio) shows how to specify and test all details of how programs […]