To do that, students translated dense legislative language into standardized variables. This approach allows analysts to test different questions over time, using employment data, investment patterns or other outcomes without rebuilding the dataset from scratch.
For GAO analysts, the value wasn’t just the data itself, but what it made possible. “The final deliverable provides us a really great resource,” Wright said. “If we have questions from congressional staff down the road about state legislation, we can go to the tools the team provided rather than starting from scratch.”
Faculty advisor Jordan Pallitto emphasized that the project mirrored the kind of work in which professional analysts often engage. “This project sits right in Heinz College’s wheelhouse,” he said. “It’s a blend of qualitative and quantitative thinking applied to a real-world problem.”
When the data complicates the story
As the team began analyzing the data, some findings challenged common assumptions. One example came from case studies they created to compare states with different incentive strategies. Arizona, for instance, attracted significant semiconductor investment despite having relatively few state-specific incentives, while other states with more targeted programs saw different results.
“The amount of money that states spend on incentives doesn’t necessarily impact the number of semiconductor facilities in their state,” Albright said.
Rather than pointing to a single explanatory factor, the data suggested a more complex picture shaped by existing industry presence, federal investment timing, workforce capacity and historical development patterns. Importantly, the team avoided drawing causal conclusions that the data could not yet support.
“That complexity is an example of how capstone projects help students transition from classroom theory to real-world application,” Pallitto said. “The team had to decide what mattered and whether something would actually be useful to GAO. They had to think like policy analysts.”
In some cases, the students’ analysis revealed gaps the GAO hadn’t yet identified. “We had a lot of unknown unknowns,” said Kelsey Kennedy, senior analyst at the GAO. “This project helped surface things that weren’t on my radar before. If I had just reviewed the database quickly, I may not have come away with those conclusions. The case studies helped pull out what mattered and why it mattered to policymakers.”
Learning to work with imperfect information and uncertainty
The project required skills that went beyond technical analysis. Unlike most classroom coursework, there was no predefined dataset and no guarantee that the information they needed would be available or consistent across states.
“We had to decide what data and which variables we should include,” said Albert Yang (MSPPM ’26). “The experience helped me a lot.”
Students also had to learn when to stop searching for data that simply wasn’t accessible. “With school assignments, you’re used to knowing there’s an answer,” Albright said. “Here, knowing when to stop researching was part of the job.”
The experience reshaped how several students think about public policy analysis, particularly the need to balance rigor with practicality when working under real-world constraints.