September 22, 2025

The Science of Impact: A Conversation with Transfr’s VP of Intelligence Yun Jin Rho

We recently sat down with Yun Jin Rho, VP of Intelligence at Transfr, to discuss how her team approaches impact measurement and management. Yun Jin shared insights on the role of data in defining success, how impact is embedded across Transfr’s operations, and the balance between rigor and practicality in research. The transcript below has been lightly edited for clarity.


Could you introduce yourself and describe your role at Transfr, including how impact fits into the work you do?

I'm the VP of Intelligence at Transfr. I lead our educational research and analytics functions and connect them to all areas of the business. As our team name implies, our work draws on a wide range of data and information sources so that we can develop a comprehensive understanding of our trainees and educators. Through this work, we define and measure the impact that matters to our customers and their communities, and we constantly evaluate ourselves against the impact we measure.

Given your academic background in data science, what personally brought you into the world of impact?

Personally, I’ve always been interested in how individuals define success, and the stories behind those definitions. With an academic background in both mathematics and psychology, it felt natural to move into quantifying and measuring success, especially within the education space. At a company like Transfr, defining and measuring outcomes is a core part of how we approach impact management. What really drew me into this work was the realization that everyone has a success story worth telling—but not everyone gets the opportunity to realize it. That understanding, and the belief that people should be able to define and redefine their own success, was the turning point that brought me into the world of impact.

How does Transfr broadly approach impact and impact measurement?

We start by understanding our customers — our trainees and educators — through our research and analytics. We collect both quantitative and qualitative data, and we conduct efficacy studies and case studies to measure our product and program impact on training outcomes, including engagement, motivation, job readiness, and job placement. Since there are multiple stages and touch points in the trainee journey, we perform customer behavior analytics to closely and frequently examine how trainees interact with the product and programs. Through this process, we can identify ideal and non-ideal behaviors, as well as the customer segments where they occur. Based on these insights, we develop intervention strategies to maximize the impact on customer success.

How has Transfr’s impact approach evolved over time?

The types of outcomes we measure, and how we measure them, have evolved; it's a complicated process. The granularity at which we measure outcomes and impact has also changed. In terms of collecting both quantitative and qualitative data, we've made a lot of improvements in the volume, accuracy, and continuity of our data. The overall maturity of our data has improved, and that's a critical component of impact measurement and management.

How do you balance scientific rigor with practical constraints when conducting research?

As scientists, we want to design and conduct rigorous research and analytics, but we also need to work within constraints such as limited resources, limited time, and data availability. Depending on the constraints, we need to choose the right method and the right type of study design. As we factor all of these considerations into collecting data and evidence, the constraints become even more complex. In those cases, we try to use proxy measures that still represent customer behaviors reasonably well. When experimental designs or lab studies aren't feasible, we can consider quasi-experimental or correlational studies. There are multiple ways to strike a balance between rigor and practicality.

Impact clearly plays a cross-functional role. Which teams at Transfr do you partner with?

We work closely with the customer success, product and engineering teams, using our analytics and learning research findings to inform the design and improvement of our VR simulations. By partnering with these teams, we’re able to use insights, including data from the impact report, to support customer renewal conversations. But it doesn’t stop with current customers. These insights also play a key role in conversations with prospective clients. We can leverage the data to identify ideal and non-ideal behaviors, as well as define the customer segments where we’re most effective. That helps the customer success team manage relationships more strategically and efficiently.

We also work with the finance team, using impact metrics as part of a data-driven strategy for investment and fundraising opportunities. It’s truly a cross-functional effort — impact isn’t the responsibility of a single team. We all work together.

What areas of improvement are you most excited to dig into in the next year?

Our primary and most important outcome is job placement and its economic impact on both the community and society. We’re focused on tracking our trainees from the classroom into career pathways to ensure they are job-ready. That requires significant work and coordination. Right now, we’re building the data systems needed to support this effort. It will take time, but it’s one of the areas I’m most excited to develop over the next year.

What are some common misconceptions or difficulties around impact measurement?

A common misconception is that the impact measurement team alone is responsible for defining and measuring impact. That’s not true — it requires collaboration across many teams in the company. Another challenge is the assumption that one type of data is enough to answer complex questions. For example, when we talk about job placement or job readiness, we're not referring to a single outcome; no single metric is sufficient to capture a person’s holistic readiness for employment. One type of data, or one methodology, simply isn’t enough to fully define and measure the impact we're aiming to understand.

How do you and your team stay current with best practices in IMM (Impact Measurement and Management)?

Our team brings a mix of backgrounds and varying levels of expertise, and we stay current by actively engaging with both academic and policy communities. We keep up with academic conferences and publications to ensure our impact measurement aligns with the latest research. But it's not just about the academic side. We also follow developments in public policy and market trends. This helps us align our impact management practices with both rigorous standards and real-world needs. For example, we pay close attention to emerging job trends so we can better prepare our trainees for the opportunities that lie ahead.

What is one piece of advice you would offer to someone who is either building their career in impact or building it out at their company for the first time?

Start with a clear understanding of your customers and what truly matters to them. That’s the foundation for defining impact. From that understanding, you can build a measurement framework that is rigorous, yet flexible enough to account for real-world constraints.

The harder part is improving impact. That requires integrating impact metrics into cross-functional work and embedding them into the company’s decision-making processes. It’s the only way to move beyond simply defining or measuring impact — and toward actually improving it in a meaningful way.
