1. Systematic review: Does business coaching make a difference?
In PLOS ONE, Grover and Furnham present the findings of their systematic review of coaching impacts within organizations. They found glimmers of hope for positive results from coaching, but also spotted numerous holes in research designs and data quality.
Over the years, outcome measures have included job satisfaction, performance, self-awareness, anxiety, resilience, hope, autonomy, and goal attainment. Some have measured ROI, although that measure seems particularly subjective. In terms of organizational impacts, researchers have measured transformational leadership and performance as rated by others. This systematic review included only professional coaches, whether internal or external to the organization. Thanks @Rob_Briner and @IOPractitioners.
2. Memory bias pollutes market research.
David Paull of Dialsmith hosted a series about how flawed recall and memory bias affect market research. (Thanks to @kristinluck.)
Not all data is good data. “We were consistently seeing a 13–20% misattribution rate on surveys due in large part to recall problems. Resultantly, you get this chaos in your data and have to wonder what you can trust…. Rather than just trying to mitigate memory bias, can we actually use it to our advantage to offset issues with our brands?”
The ethics of manipulating memory. “We can actually affect people’s nutrition and the types of foods they prefer eating…. But should we deliberately plant memories in the minds of people so they can live healthier or happier lives, or should we be banning the use of these techniques?”
Mitigating researchers’ memory bias. “We’ve been talking about memory biases for respondents, but we, as researchers, are also very prone to memory biases…. There’s a huge opportunity in qual research to apply an impartial technique that can mitigate (researcher) biases too….[I]n the next few years, it’s going to be absolutely required that anytime you do something that is qualitative in nature that the analysis is not totally reliant on humans.”
3. Female VC → no gender gap for startup funding.
New evidence suggests female entrepreneurs should choose venture capital firms with female partners (SF Business Times). Michigan’s Sahil Raina analyzed data to compare the gender gap in successful exits from VC financing between two sets of startups: those initially financed by VCs with only male general partners (GPs), and those initially financed by VCs that include female GPs. “I find a large performance gender gap among startups financed by VCs with only male GPs, but no such gap among startups financed by VCs that include female GPs.”
4. Sharing evidence about student outcomes.
Results for America (RFA) is launching an Evidence in Education Lab to help states, school districts, and individual schools build and use evidence of ‘what works’ to improve student outcomes. A handful of states and districts will work closely with RFA to tackle specific data challenges.
Background: The bipartisan Every Student Succeeds Act (ESSA) became law in December 2015. ESSA requires, allows, and encourages the use of evidence-based approaches that can help improve student outcomes. Results for America estimates that ESSA’s evidence provisions could help shift more than US$2 billion of federal education funds in each of the next four years toward evidence-based, results-driven solutions.
Posted by Tracy Allison Altman on 27-Jul-2016.