Webinar on Shared Measurement--Questions and Answers


Webinar Questions, Answers, and Discussions

 

FSG's webinar on Shared Measurement Evaluation Systems on March 24, 2010 (with guest hosts from Strive and the Cultural Data Project) prompted excellent questions and comments from more than 900 live participants. We have compiled the more than 140 questions we received during the webinar, and we look forward to your thoughts and responses as we continue the exchange on approaches to shared measurement for large-scale social impact. Please help us continue the dialogue by participating in these questions and answers!

 

Here is a list of categories that our discussion and questions broadly fell into. Please scroll down to see the Q/A within each category. We encourage you to continue the conversation by adding your thoughts or perspectives as well. Please add your first name or initials, and the organization or type of organization you represent after your comments. There is also a Comments section at the bottom of the page. Thank you!

 

WEBINAR DISCUSSION/QUESTION CATEGORIES:

Start-Up/Operations/Sustainability
Implications for Strategy
Participation and Incentives
Data Collection/Reporting/Revisions
Reliability of Data
Relationship to Other Systems
Shared Measurement in International or Other Sectors
Process for Continuous Learning


 

Start-Up/Operations/Sustainability

 

Q: What is the sustainability model for Shared Measurement Systems?

A:  (Provided by FSG) There are a few business models for Shared Measurement Systems:

  1. Subscription: The Success Measures Data System uses a subscription model to sustain its operation. Participating organizations (e.g. community development nonprofits) pay a $2,500 annual subscription fee and can also purchase a one-time coaching and training package starting at $7,500. Typically, funders subsidize the one-time coaching and training fees while the nonprofits pay the ongoing annual subscription.
  2. Supported by funder participants of the system: In the case of the Cultural Data Project, funders pay the $400/group cost of participating in the system.
  3. Grant funded: Systems like Strive and Pulse sustain themselves by raising grant funding from multiple sources.

 

Q: Is it likely that the cost of developing Shared Measurement Systems will fall as more and more of these systems are developed?

A:  (Provided by FSG) There are two broad categories of costs involved in developing Shared Measurement Systems:

  1. Researching and developing the indicators, facilitating a collaborative and participative process to come to agreement on the indicators, etc.
  2. Technology costs to build the system.

The second category, technology costs, will continue to drop, especially for technologies that are particularly useful for Shared Measurement Systems. The first category of costs will likely shrink as outcome indicators for the most common issues are developed, so that less time has to be spent on research and development. However, the cost involved (mostly time and effort) in building agreement on the indicators will likely not decrease, since these activities must be conducted each time such a system is built. On the other hand, as these systems become more commonplace, participants may be more willing to collaborate, and the process of coming to agreement on the indicators may be simplified.

 

Q: Please share the sustainability plans for these systems, e.g. if Pew is managing the CDP now as a project, what will happen at the end of the project? Will the system continue and how?

A: (Provided by CDP) Speaking for the Cultural Data Project and Pew, we and our partners in Pennsylvania and other states see the CDP as an ongoing project and we ask our funding partners to make long-term commitments to participation. For Pew and our many partner funders, we see support for the Cultural Data Project as an ongoing cost of doing business, more efficient than hiring outside consultants to collect data on our applicant pools for periodic evaluations, and also a way of providing philanthropic support to our grantees who use the CDP reporting tools to build their organizational and planning capacity. For Pew as project manager, we expect to continue operating the CDP on behalf of our partners as long as they believe we are adding value to the project.

A: (Provided by Strive) Strive has established a sustainability committee to develop a plan for the local sustainability of the Strive partnership.  In addition to raising funds for the staffing and operations of the partnership, we are also looking at potential partnerships with other regional efforts to help build ongoing sustainability.

 

Q: Development costs were covered, but what does it cost to operate CDP and Strive? Are they self-sufficient? At what point do economies of scale kick in to reduce per-use costs?

A: (Provided by CDP) The annual operating cost, which is based on an at-cost model in each state, varies significantly from state to state, as it depends upon several factors, including the number of participating cultural organizations in the state and the number of participating funders. We do see cost-efficiencies coming into play by the third year of operations in each state, and some economies of scale as the project grows nationally. We are also looking at further efficiencies to be gained through a regional approach (e.g. in New England), especially in regions where most organizations are small or located in rural areas.

A: (Provided by Strive) Strive's annual operating budget is around $1 million. In working with other communities to create similar partnerships, we have determined that such a partnership can be staffed for around $500,000 if the scope of work undertaken is more streamlined and focused, and local resources are leveraged.

 

Q: For the cultural data project, what are the sustainability plans for the data system? Will Pew continue to provide staffing and TA support indefinitely? Or is there an expectation that participating organizations will carry the cost of the system in the future?

A: (Provided by CDP) The CDP is currently operating under a business plan that takes us through 2014 and envisions that Pew will continue to operate the project on behalf of our partners at least through that year.  We will continue to operate it as long as our partners find that we add value to the project. Our financial model is based on ongoing financial support by the diverse base of funders in each participating state, and we talk with all prospective partners about the need for such support of the CDP to be considered an ongoing cost of doing our business — more cost efficient than hiring outside consultants for periodic analysis of applicant pools, for example. The investment in the CDP is a very cost-effective way for funders to support the organizational learning and development of their grantees, as well. We currently do not envision a changed business model that would charge cultural organizations to input their data into the CDP profile.

 

Q: What is the approximate cost for a state to launch CDP?

A: (Provided by CDP) The cost per state varies based on several key factors, primarily the number of cultural organizations that are likely to participate and the number of funders that participate. We are able to gain cost efficiencies over time in each state within which we work.  In recent months we have also been exploring ways to drive down costs by working in a regional context (e.g. the six New England states).

 

Q: How did you get the initial funding for this work?

A: (Provided by Strive) Strive was fortunate that KnowledgeWorks Foundation, a national education foundation based in Cincinnati, Ohio, agreed to support initial operations for the effort. Other local and national foundations have since come to the table to support the work.

 

Q: For the systems that have ongoing live support (i.e. people who are ensuring the data is accurate) what are the annual operating costs?

A: (Provided by Strive) Annual operating costs for Strive are around $1 million, but we are leveraging an additional $500,000 or so in in-kind support from the business community (loaned executives from GE Aviation) and from higher education, which supports the data collection and analysis work we are doing.

 

Q: How much did it cost to create Strive?

A: (Provided by Strive) Strive's annual operating budget is around $1 million. Since Strive's inception in 2006, KnowledgeWorks Foundation has invested approximately $2.5 million in operations and general support, the Bill & Melinda Gates Foundation has invested $400,000 to support communications and community engagement work, and local funders have invested an additional $500,000 in support and pass-through funds to support networks and partners of Strive. In addition, we have leveraged more than $2 million in in-kind support through loaned staff, research and other resources from partners. However, in our work with partners across the country through the Living Cities partnership, we are helping other cities create similar partnerships with more focused scopes of work for less funding, more along the lines of a $300-500K annual operating budget.

 

Q: What process was used with different stakeholders (and who were the key stakeholders?) to arrive at the single set of outcomes?

A: (Provided by Strive) We established a data committee composed of local data experts and key representatives from the various sectors (i.e. secondary education, higher education, business, philanthropy and community). This committee established a set of criteria for selecting indicators, which became the lens through which it could review potential indicators. This process took more than six months, and in the spirit of continuous improvement, the committee continues to meet and make decisions about removing or revising the indicators that have been selected. Several changes have been made to the indicators since the baseline year in 2008 as a result of this committee's work.

 

Q: What did the planning process or exploratory phase of these projects require? How can we support this in grantmaking?

A: (Provided by Strive) From Strive's perspective, especially as we try to roll out this type of culture change and use of data for decision making in cities throughout the country, it would be helpful for grantmakers, who are calling for more collaboration and evidence-based program improvement, to reward this kind of data use and collaboration in their grantmaking. It has been our experience that many grantmakers say they want to see better use of data and evidence of impact, yet continue to support legacy programs with little demonstrated impact. Or they want to support the next innovative idea without providing ongoing support for initiatives that get results. We are asking nonprofit providers for culture change, but this means the funding community will need to change the way it does grantmaking as well. In addition, grantmakers can help build the capacity of nonprofit organizations to do this type of data collection and analysis, and support partnership efforts like Strive that provide collective technical assistance to community stakeholders.

 

Back to Top

 

Implications for Strategy

 

Q: If agencies are measuring common outcomes does this mean their strategies or interventions need to be similar or the same?

A:  (Provided by FSG) The short answer is no. For example, two organizations involved in workforce development could be measuring the same outcome, e.g. job retention after six months. However, depending on the population they are targeting, their strategies might be completely different: a program targeting harder-to-serve individuals, such as those recently released from prison, might offer longer programs with more emphasis on soft skills, while a workforce organization serving recently unemployed populations might focus on shorter, skills-based training. In both cases, however, having a common outcome indicator (in this case, job retention after six months) is useful to ensure all organizations are working toward a commonly defined outcome goal.

 

Q: Does Strive use their roadmaps to help NFPs adjust their programs to reduce redundancy or even to advocate mergers of those with large overlaps on type of service and targets of service?

A: (Provided by Strive) One of the first steps a Strive network of providers takes is mapping services and assets in an attempt to align them with need in the community. This process, together with the Strive Six Sigma action planning process, has resulted in the identification of redundancies and duplication of services; as a result of the collaborative action planning process and the work with funders to support such plans, we are working toward more effective and efficient use of resources.

 

Q: In the Strive model, how do you analyze, with stakeholders from the community, the accelerators and inhibitors of success (or failure)? This is as interesting for future planning as the results themselves.

A: (Provided by Strive) The Report Card of high-level indicators is more of an annual check-in to see how we are doing, but the real data analysis happens with the networks on the ground. Strive is providing hands-on technical assistance to the networks of providers to look at the data and better understand what it means for program improvement and how to direct resources based on data.

 

Q: I would like more information on Strive and how data were shared for this project. Was it at the level of a shared platform or of an adaptive learning model?

A: (Provided by Strive) Strive is more of an adaptive learning system, as we are working with networks of on-the-ground providers to determine common indicators as part of a shared measurement and accountability system.

 

Back to Top

 

Participation and Incentives

 

Q: What incentives do Shared Measurement Systems offer to encourage and enhance collaboration?

A:  (Provided by FSG) One incentive some funders of Strive are considering is funding an entire Student Success Network (a collection of agencies that work together to act on particular interventions on the Student Success roadmap, e.g. early childhood education) instead of funding particular agencies within a network. In this way, the whole network is held accountable for the achievement of specific outcomes and has an incentive to collaborate to achieve this end. We found through our research that while the promise of funding can be a motivator to encourage agencies to enroll in a collaborative effort, it cannot be the sole motivator of continued participation. The single biggest reason agencies stay in a collaborative is that it actually helps them do their work better; when agencies realize this, their collaboration becomes voluntary and far more productive.

 

Q: How can organizations help funders move in this direction?

A:  (Provided by FSG) Having funders read the FSG report or attend speaking sessions on the subject is a way to begin educating them about the possibilities of shared measurement. Putting interested funders in touch with FSG for further conversations can deepen their understanding. Having funders speak to other funders already participating in one of the Shared Measurement Systems featured in the report is one way for them to learn why those funders are doing this and what benefits they have seen.

 

Q: What has been the most important factor behind getting the funders to understand the point of measuring a collective effort rather than individual grants?

A: (Provided by Strive) This has not been easy. On one hand, we have had funders, because of a decrease in overall resources, calling for greater collaboration and investment in shared action plans, but they then continue to make grants in the same way they always have. We are fortunate to have had some early adopters of this type of grantmaking in our corporate community, and the early results have spoken for themselves. Our hope is that other funders will follow suit.

 

Q: Who are the CDP folks for Ohio?

A: (Provided by CDP) Please see www.ohculturaldata.org for information about our funding partners in Ohio.

 

Q: Is the Cultural Data Project voluntary, or is private funding of organizations now based on participation?

A: (Provided by CDP) Most organizations do input their data as part of a grant application as required by their funders. However, it is possible for any organization to submit its own data to the system voluntarily and thereby to be able to generate trend reports on its own activity.

 

Q: When you say the Cultural Data Project is working with a state, with whom exactly are you working: state arts commission? community foundation? other?

A: (Provided by CDP) We do, as you suggest, work with state arts agencies and with community foundations, private foundations and some corporations. We also work with local arts agencies and local and state level advocacy organizations, so our working group of partners in each state is varied and robust.

 

Q:  For CDP - How many participating orgs are funder partners? Have any changes been made to the data being collected as a result of the involvement of a new funder partner?

A: (Provided by CDP) Currently the CDP has about 150 participating funders across all the states in which we work. We do not change the CDP profile at the request of a single funder. We have developed a process and protocols for making changes periodically based on several criteria, including broad agreement from funders and cultural organizations in our system that a change would add significant value both to the profile and to research that could be conducted using the new information, and agreement about the specific nature and definition of each data point we would add. From time to time we will also consider whether to delete data points that are not being used broadly, but we have not done so yet.

 

Q:  What barriers are you finding among the teachers? Can you drill down to the individual classroom? If not, will you at some point?

A: (Provided by Strive) We are just beginning to get into the work of teaching and learning in the classroom. Last year, Strive and several of our partners sponsored The New Teacher Project to conduct research on Cincinnati Public Schools' teaching practice, which produced five recommendations:

  1. Revamp teacher evaluation practices.
  2. Retain the best teachers through performance-based compensation.
  3. Turn around schools by bringing more effective teachers into the schools with the most need.
  4. Help less effective teachers improve through professional development and, if they don't improve, implement a more streamlined process for removal.
  5. Optimize the supply of new teachers by hiring early and from teacher education programs with proven records of effectiveness.

We are advocating that these recommendations be considered within the context of contract negotiations that are happening now in the district. All of the Strive partners, from business leadership to colleges of education, are interested in improving outcomes for teachers and will work to be able to measure what's happening in the classroom as we build strategies in this area.

 

Q:  What is Strive's definition of a community? Can I assume that Strive's outcomes (%) are within one school district?

A: (Provided by Strive) The Strive Partnership is describing our "community footprint" as the urban core of Greater Cincinnati/Northern Kentucky, which includes three urban school districts -- Cincinnati Public Schools (OH), Newport Independent Schools (KY) and Covington Independent Schools (KY), as well as the parochial schools within the urban core. Our Striving Together Report Card reports on outcomes for all three districts and selected parochial schools. 

 

Q:  Can any of the panelists address overcoming resistance from practitioners who believe that shared learning outcomes are not realistic to create?

A: (Provided by Strive) Strive has found some resistance within groups of like providers who deliver similar services (e.g. mentoring) using different methods (e.g. group vs. one-on-one) and have therefore resisted the process of determining a shared set of outcomes. This has been a challenge: we are trying to avoid making a value judgment about one type of mentoring versus another, given the diverse needs of various student populations; however, without agreement on a core set of outcomes, comparison of the strength and validity of indicators becomes inevitable and the value judgment is put into play anyway. This is just one example, but it is an ongoing struggle when you are dealing with so many different providers.

 

Q: How did you incentivize ongoing data input and collection from your volunteer partners?

A: (Provided by Strive) This has been the greatest barrier to success for Strive. We should have raised a pool of "incentive" funds at the outset of this work. The networks that have been most successful, and that have moved the furthest the fastest, had dedicated funding support organized for them to do this work. Those networks without "incentive funds" have struggled: they believe in the work, but capacity is an issue. We are in the process of raising a more general "incentive fund" to help with this.

 

Back to Top

 

Data Collection/Reporting/Revisions

 

Q: How is qualitative data collected and analyzed in Shared Measurement Systems?

A:  (Provided by FSG) Shared Measurement Systems emphasize the importance of qualitative data. The Success Measures Data System allows participant organizations to measure the impact of their work by providing a broad range of tested qualitative and quantitative data collection instruments to measure outcome indicators. A specific example of how qualitative data is used comes from the Wachovia Regional Foundation, which sponsors its grantees' use of Success Measures. Wachovia brought twenty-two grantees together in November 2008 to look at the results of one of the indicators, the Resident Satisfaction Survey, which includes qualitative and quantitative data. They found an interesting commonality in the results: the number one thing residents liked about their neighborhoods was the friendliness of their neighbors, and there was a positive correlation between sense of friendliness and feelings of safety. This was true across the region in different low-income communities. Based on this information, Wachovia was convinced that support for community building programs (the so-called "soft funding" that many funders are reluctant to provide) was in fact very important. Shared Measurement Systems are also careful to ensure that quantitative data is interpreted correctly by considering the qualitative context within which a measurement is observed. The California Partnership for Achieving Student Success (Cal-PASS) system, a K-16 data-sharing platform, supports sixty-seven Professional Learning Councils (discipline-specific groups of faculty and staff across the K-16 continuum) in reflecting on their data and discussing implications for curriculum and instruction. Participants view these meetings as essential to distill meaningful lessons from the comparative data.
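
As a purely illustrative aside, the friendliness/safety finding above is the kind of relationship a shared system can surface once survey results are pooled. Here is a minimal sketch of computing such a correlation; the 1-5 rating scale, field names and sample responses are hypothetical, not the actual Resident Satisfaction Survey instrument or its data.

```python
# Minimal sketch: does residents' sense of neighborly friendliness
# correlate with their feelings of safety? All data here is made up.
from statistics import mean

def pearson(xs: list[float], ys: list[float]) -> float:
    """Pearson correlation coefficient between two equal-length samples."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x ** 0.5 * var_y ** 0.5)

# Each respondent rates friendliness and safety on a hypothetical 1-5 scale.
friendliness = [4, 5, 3, 2, 5, 4, 1, 3, 5, 2]
safety       = [4, 4, 3, 2, 5, 5, 2, 3, 4, 1]

r = pearson(friendliness, safety)
print(f"friendliness vs. safety: r = {r:.2f}")  # a positive r echoes the finding
```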

 

Q: How are performance measures in Shared Measurement Systems customized, revised, and extended?

A:  (Provided by FSG) The Shared Measurement Systems we studied are all built to allow for customization, revisions and extensions over time. The Pulse system, developed jointly by Acumen Fund, Google and Salesforce and targeted at investors in social enterprises, is an example that allows for customization. The key component of the Pulse system is the investment profile, which stores data about the amount and structure of an individual investment, as well as the performance metrics for that investment. For each new investment, a portfolio manager can choose from a list of existing metrics or create new metrics, which are then added to the universal list (see the sketch below). Similarly, Success Measures provides a structure that allows each affiliate organization to choose either standard or custom indicators. Success Measures is also a good example of a system that is continually extending itself: at the time the report was written, Success Measures planned to augment its menu of indicators with 15 new outcome indicators focused on foreclosed, real-estate-owned, and vacant and abandoned properties. Success Measures was also developing new tools to measure the impact of various programs along the asset continuum (e.g., financial education, asset building, etc.) and the value of services provided by intermediaries (e.g., training, technical assistance, etc.), and was researching opportunities to incorporate affordable green building principles into its outcome measurement system.
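
To illustrate the choose-existing-or-create-new pattern described for Pulse's investment profiles, here is a minimal sketch. The class and field names (Metric, MetricRegistry, InvestmentProfile) are illustrative assumptions, not the actual Pulse schema.

```python
# Minimal sketch: each investment profile draws its metrics from a shared
# registry; any newly created metric is added to the universal list so
# later investments can reuse it. Names are illustrative, not Pulse's API.
from dataclasses import dataclass, field

@dataclass
class Metric:
    name: str
    unit: str

@dataclass
class MetricRegistry:
    universal_list: dict[str, Metric] = field(default_factory=dict)

    def get_or_create(self, name: str, unit: str = "count") -> Metric:
        """Reuse an existing metric by name, or register a new one."""
        if name not in self.universal_list:
            self.universal_list[name] = Metric(name, unit)
        return self.universal_list[name]

@dataclass
class InvestmentProfile:
    investee: str
    amount_usd: float
    metrics: list[Metric] = field(default_factory=list)

registry = MetricRegistry()
registry.get_or_create("jobs_created")  # a metric that already exists

profile = InvestmentProfile(investee="Acme Water Co.", amount_usd=250_000.0)
profile.metrics.append(registry.get_or_create("jobs_created"))           # reuse
profile.metrics.append(registry.get_or_create("liters_delivered", "L"))  # new; joins the universal list

print(sorted(registry.universal_list))  # ['jobs_created', 'liters_delivered']
```

The design choice worth noting is that customization feeds back into the shared vocabulary: a metric created for one investment becomes available to every later one, which is how the universal list grows without central redesign.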

 

Q: What is the most effective way to train organizations on how to correctly interpret, use and learn from the data in these systems?

A:  (Provided by FSG) Shared Measurement Systems like Success Measures offer a comprehensive training package, with on-site coaching and training starting at $7,500; in this way participating agencies are trained to use the system correctly so that they can do so without depending on Success Measures in the long term. There are different ways in which Shared Measurement Systems enable usage of and learning from the system. Within NeighborWorks America, whose network of community development organizations uses Success Measures, for example, clusters of organizations get together at regular intervals to review and discuss the results and learn from them. Strive has the most structured process we found in our research for enabling learning: it has adapted GE's Lean Six Sigma process to help Student Success Networks (action networks composed of 10-15 organizations addressing specific interventions on Strive's Student Success roadmap framework, e.g. early childhood education) collectively learn from the data collected. Please refer to the Strive case study in FSG's Breakthroughs in Shared Measurement report for more information.

 

Q: Do the CDP clients only report on their financial data? Or do they also report on their non-financial impacts?

A: (Provided by CDP) Yes, they report both: organizations report programmatic and operational data on their most recently completed fiscal year in addition to financial data. Examples include information about contributions; general admissions and participation by school children; exhibitions, productions and events; number and size of performance, exhibition and other facilities; website usage; etc. You can see the specific questions by going to www.pacdp.org and clicking on Data Profile Instructions on the left-hand side in the HELP section.

 

Q:  Does participation in CDP fulfill or integrate with the requirements of reporting to the IRS?

A: (Provided by CDP) The CDP collects financial data based on Generally Accepted Accounting Principles (GAAP) and FASB rules, rather than on the information required by the IRS Form 990, which collects much less financial information than the CDP and asks questions, such as the salaries of key individuals (and, in the new 990, questions about the existence of conflict-of-interest and whistle-blower policies, to name two), that the CDP does not collect. While the 990 deals with tax compliance issues, the CDP is focused on organizational performance and the ability to track and learn from trends and benchmarking with other organizations. Also, organizations can fill out their CDP profile as soon as they have closed their books on the previous fiscal year (with a board-approved audit, or financial statements if they are not audited), which means the information in the CDP can be more timely than in their 990 submissions. We are currently doing some analysis of the new 990 form to see if it would be helpful to the CDP's constituent organizations to provide some kind of fact sheet showing which data points in the CDP could be transferred directly onto their 990 filing.

 

Q:  We are currently running the 40 developmental assets survey in our schools; why is it being debated in your region?

A: (Provided by Strive) We are in the process of developing a different measure for being "supported in and out of school" because it is costly and labor intensive to administer the Developmental Assets survey on an annual basis. We believe the Developmental Assets survey is a good assessment of this type of "support," but we are working with our partners, in particular the United Way, to determine whether there are other assessments that could be more effective in this area. The debate is really around how we define what it means to be "supported" and how that can be assessed.

 

Q: Just curious -- what led Strive to set the bar for developmental assets at "more than 20?"

A: (Provided by Strive) It wasn't exactly scientific. Our data committee considered that the data from the Developmental Assets survey is broken into quartiles and found that nearly half of young people surveyed nationally report having fewer than 20 of the 40 Developmental Assets. Thus, the committee determined that "more than 20" was a good benchmark for this measure, since no benchmark had already been established for what it means to be "supported." Not an exact science, and another reason why Strive has had some challenges with this measure.

 

Q: Question for STRIVE - Do any of your programs target the school-to-work transition and if yes, how do you measure those outcomes?

A: (Provided by Strive) The Strive Youth Career Access Network (CAN) focuses on youth (ages 16-21) transitioning from school to work and has several outcome measures:

  1. Percent of employers that rate youth as meeting 90% of expectations/requirements. (The team uses a slightly modified Department of Labor checklist in a survey to measure this outcome.)
  2. Number of youth who retain employment for 30/60/90 days.
  3. Percent of youth who complete a summer employment program.

In addition, this team also measures other outcomes leading to a successful transition, such as obtaining a HS diploma/GED and enrolling in advanced training (vocational, college, etc.).

 

Q: How does the Strive student dashboard compare or contrast with the National Dropout Prevention Center's? Is it adequate to prevent or reduce grade retention in a timely fashion, and thereby increase graduation rates?

A: (Provided by Strive) The focus of Strive's Learning Partner Dashboard is more about creating a central database system for partners to track the services they are delivering to students and begin to assess the impact of those services. The Learning Partner Dashboard expands on an existing teacher dashboard application that the Cincinnati city school district has in place to track academic success, including measures of attendance, behavioral incidents, state tests and benchmark tests. This teacher dashboard also serves as a tool to track student grades and other measures of success on a real-time basis and to determine which students need targeted interventions.

 

Q: Do the collaboratives have data beyond the high-level outcomes indicator to inform their "data-driven decision-making," e.g. does the early childhood ed. collaborative have more data than k-readiness scores?

A: (Provided by Strive) Yes, part of the Strive Six Sigma action planning process is the Measure Phase, in which the individual networks of providers determine a common set of measures -- qualitative and quantitative, as well as process and outcome -- for their on-the-ground work. So, for early childhood, for example, they are actually looking at eight measures and holding the network members accountable for these eight measures, as they think about how to make investments based upon data. The kindergarten readiness measure is just the high-level, ultimate outcome.

 

Q: Do Strive member agencies focus on the stated outcomes/indicators only - or do members have other outcomes/indicators specific to themselves?

A: (Provided by Strive) Those ten outcome measures that were discussed are only the high-level outcome measures. Each of the networks of providers is establishing its own set of process and outcome measures that relate to its individual area of work. This is the "Measure" phase of the Strive Six Sigma action planning process and ultimately what the networks of providers hold themselves accountable for in order to make program improvements and resource decisions based on data.

 

Q:  For joint programs - like some of TRIO - that reflect partnerships between schools and community or university, how do different constituencies use the same data? Do they share common conclusions, e.g., is an intervention as timely for a school as it may be for a college or community partner? And might they be timely in the same or different ways?

A: (Provided by Strive) Not sure whether I fully understand this question, but I can say that TRIO and GEAR UP programs are participants in several of the Student Success Networks we have established through Strive, including the College Access Network, the Retention Network and even the Mentoring and Tutoring Networks. The partnerships between higher education and secondary schools help to facilitate better sharing of data, especially as it's easier to get the student participants in these programs to sign a waiver making their individual data available. In addition, these programs are typically externally evaluated, and data collection and analysis is part of the culture. What we have found, however, is that oftentimes the indicators identified for measurement within these types of programs are more process- than outcome-oriented, and so this effort, with its collaborative measure identification and data collection, has helped these programs expand the indicators they review as part of their evaluation efforts. If this is not what you were asking, please contact me at blatzj@strivetogether.org.

 

Q: Did Strive encounter resistance in obtaining students' test scores, and was privacy part of the resistance?

A: (Provided by Strive) The data that Strive is reporting is aggregate and widely available to the public. In the on-the-ground work that we are helping to facilitate with networks of providers there have been cases in which privacy issues have arisen and we work with those providers and the districts to obtain the permissions necessary to maintain compliance with FERPA regulations. It's not always possible for providers to access student-level data because of privacy concerns, but we try to work with them and the districts to get them what they need for their action planning work, while protecting the rights of students.

 

Q: How do you overcome privacy issues related to student data? Are agencies willing to share data openly with one another?

A: (Provided by Strive) We work very hard to establish close working relationships with the school districts and providers in order to navigate the privacy issues while staying in compliance with FERPA. In some cases, this means aggregating data, requesting that students/families sign waivers, or removing identifying information from the data in order to use it. The Partner Dashboard that is being created will enable partners to provide data on the services provided and then get linked student academic achievement data in return. However, in order to remain compliant with privacy laws, if a student/family has not signed a waiver to allow access to this data, the data will be returned to the partner in either an aggregate or blinded format (see the sketch below).
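
As an illustration of the waiver rule described above, here is a minimal sketch of how a partner report might return identifiable rows only for students with a signed waiver, and an aggregate for everyone else. The record fields are hypothetical, not the actual Partner Dashboard schema.

```python
# Minimal sketch: linked student-level results go back to a partner only
# where a waiver is on file; the rest is returned in aggregate form.
# Field names are hypothetical, not the actual dashboard schema.
from dataclasses import dataclass

@dataclass
class StudentRecord:
    student_id: str
    has_waiver: bool
    reading_score: float

def report_for_partner(records: list[StudentRecord]) -> dict:
    """Split results into identifiable rows (waiver signed) and an aggregate."""
    identified = [
        {"student_id": r.student_id, "reading_score": r.reading_score}
        for r in records if r.has_waiver
    ]
    blinded_scores = [r.reading_score for r in records if not r.has_waiver]
    aggregate = {
        "n_students": len(blinded_scores),
        "avg_reading_score": (
            sum(blinded_scores) / len(blinded_scores) if blinded_scores else None
        ),
    }
    return {"identified": identified, "aggregate_without_waiver": aggregate}

records = [
    StudentRecord("S001", has_waiver=True, reading_score=82.0),
    StudentRecord("S002", has_waiver=False, reading_score=74.0),
    StudentRecord("S003", has_waiver=False, reading_score=68.0),
]
print(report_for_partner(records))
```

A production system would typically also suppress aggregates over very small groups so individuals cannot be re-identified; that detail is omitted here for brevity.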

 

Q: Can STRIVE share more about the report card? What does it include exactly?

A: (Provided by Strive) The Striving Together Report Card is available at our website: www.strivetogether.org. It is an annual community-level report on indicators across the five Strive goal areas. Its primary purpose is to establish a baseline and benchmarks while tracking trends along a set of high-level outcomes, as a barometer for the community to hold the Strive partnership accountable for education improvement.

 

Q: Another question for Strive: What data underpins the analysis of the "enters career" part of Goal Five? The indicators all seem to be about the PS achievement and attainment part of the effort.

A: (Provided by Strive) This has been an ongoing struggle. It's a case where no good data on "entering a career" is currently available. We learned through this process that colleges do not collect this data in a consistent manner. We wanted to include some data available from the Ohio Board of Regents on "career attainment," but because the data was inconsistent, it was judged not to meet the criteria for inclusion by the data committee that determines what's in or out of the Report Card. The discussion about needing to include something in this area is ongoing, and this question comes up all of the time, so we will continue to look at ways to include measures in this area, as the post-secondary attainment data does not cover it.

 

Q: It would seem that State Departments of Education would be interested in the type of data generated by STRIVE?

A: (Provided by Strive) In Ohio we have talked with our State Department of Education about the work we are doing with Cincinnati Public Schools to establish a Partner Dashboard in an effort to connect the work of the many providers to student achievement. They are very interested in the outcomes of this pilot phase of the dashboard, as it could change the context of data on student achievement that is currently available.

 

Q: What connection is there between this level of data consolidation and Secretary Duncan's RTTT-required consolidations? And how do you handle issues of confidentiality between health and education?

A: (Provided by Strive) The Race to the Top guidelines have discussed the importance of developing longitudinal, cradle-to-career, student-level data systems at the state level in order to better facilitate the use of data in program improvement. This type of data system, which does not currently exist in Ohio or Kentucky, would make the type of data collection and analysis we are trying to do at the local level much easier. At the local level we are cobbling together data from several different systems. We are also building a Learning Partner Dashboard that links data from partners (such as mentoring, tutoring, college access, after-school programs) to student-level academic data for use in program assessment and evaluation. To answer your question about gaining access to health data: HIPAA permits us to access this type of data only in aggregate form, but we are able to work with the school districts, which can link health data to academic data and then match that with participation data in various programs, in order to get a better picture of the impact of student health on academic achievement.

 

Q: Can you still benchmark your organization’s performance against the data from other states in the Cultural Data Project if your state is not participating?

A: (Provided by CDP) The CDP is an open system in that any organization could go into one of our states’ sites and enter and submit their data. An organization could then run both trend and comparison reports but the organization’s own data would not be part of any research project.

 

Back to Top

 

Reliability of Data

 

Q: Would be interested to expand on the reliability of the data. How did you mitigate any possible inconsistencies? Thanks.

A: (Provided by CDP) CDP has several layers of error-checking and review. First, a lot of automated error-checking is built into the system, such that data profiles can only be submitted when, for example, numbers are totaled correctly (this applies to the 70 percent of data points that relate to organizations' audits or financial statements); a sketch of this kind of check follows below. Second, CDP's Help Desk staff review every profile submitted for other possible anomalies, based on a lengthy checklist; cultural organizations are contacted with queries about each such anomaly and asked to clarify whether an error has been made or whether there is another explanation. When errors have been made, the organization itself is asked to correct them (CDP does not enter or edit data in any profile). Third, our system provides extensive training and education about the process of filling out the profile, online, through webinars and in-person trainings, and through our help desk. Fourth, because organizations have access to 77 trend and comparison reports using their own data, there is an incentive for them to get their own data correct so that the reports are useful to them.
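
To make the first layer concrete, here is a minimal sketch of the kind of automated totals check described above. The field names and tolerance are illustrative assumptions, not the CDP's actual data profile schema.

```python
# Minimal sketch: a profile can only be submitted once its line items
# total correctly. Field names are hypothetical, not the CDP's schema.

def check_totals(profile: dict, tolerance: float = 0.01) -> list[str]:
    """Return human-readable validation errors (an empty list means clean)."""
    errors = []

    # Balance-sheet identity: assets must equal liabilities plus net assets.
    assets = profile["total_assets"]
    liabilities_plus_net = profile["total_liabilities"] + profile["net_assets"]
    if abs(assets - liabilities_plus_net) > tolerance:
        errors.append(
            f"Balance sheet does not total: assets {assets:,.2f} vs. "
            f"liabilities + net assets {liabilities_plus_net:,.2f}"
        )

    # Reported total revenue must equal the sum of its line items.
    revenue_lines = profile["earned_revenue"] + profile["contributed_revenue"]
    if abs(profile["total_revenue"] - revenue_lines) > tolerance:
        errors.append("Total revenue does not equal the sum of revenue lines")

    return errors

profile = {
    "total_assets": 500_000.00,
    "total_liabilities": 150_000.00,
    "net_assets": 350_000.00,
    "total_revenue": 410_000.00,
    "earned_revenue": 260_000.00,
    "contributed_revenue": 150_000.00,
}

issues = check_totals(profile)
if issues:
    print("Profile cannot be submitted:", *issues, sep="\n  ")
else:
    print("Automated checks passed; profile queued for staff review.")
```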

 

Q: How do you ensure the quality and validity of the data collected in Shared Measurement systems?

A:  (Provided by FSG) All Shared Measurement Systems have some ongoing infrastructure in place to ensure the validity, quality, and integrity of the data collected in the system. In the case of the Pulse system, social enterprises do not input their data directly; Acumen ensures data integrity by requiring portfolio managers to review enterprises' data before it is entered into the system. The Cultural Data Project has two levels of data validation. The online Data Profile that is the cornerstone of the system comprises eleven sections that collect information about everything from basic organizational identification to detailed financial data and performance attendance statistics. Every organization participating in the Data Project completes the form on an annual basis (though data in some sections may remain the same from year to year). Upon completion of the form, the web-based system automatically checks the data for common errors (e.g., failure to correctly total balance sheet items) and allows users to make corrections. Once the final profile is submitted, Data Project staff review the data to ensure its accuracy and integrity.

 

Q: There are audited financial statements that do NOT reflect real costs allocated correctly. Historic data does not make it correct.

A: (Provided by CDP) We agree and have learned that by virtue of filling out the Cultural Data Project profile many organizations have discovered that their audits do a less than adequate job of representing their organizations. In a number of cases the result has been that organizations have changed the format of their audits, and even have changed auditors. It has been our experience that many auditors are not experienced at preparing statements for nonprofit organizations and may not be correctly reflecting the finances and condition of organizations, and the CDP has helped to raise this as a governance issue for organizations in participating states.

 

Q: How do you ensure that the data gathered doesn't result in attempts to improve certain outcomes (e.g., increasing ACT scores by teaching to the test and emphasizing rote memorization) at the expense of other important outcomes (e.g., supporting and developing children's innate love of learning)? It seems that the myriad efforts to improve education with increased standards, standardized measures and testing have driven some incremental improvements in student performance, but at the expense of many key 21st-century skills (lifelong love of and interest in learning, critical thinking, curiosity, adaptability, teamwork, etc.).

A: (Provided by Strive) Great question! This comes up frequently, and what is unique about the Strive Partnership is that the partners are committed to the idea that not all learning happens within the "little red schoolhouse." That is why our foundational document, the Strive Roadmap, includes both academic and 21st-century skills, and why our many networks working to support the schools are focused on the student/family support aspect of learning that is so critical. The problem is establishing a way to effectively measure these 21st-century skills. We are able to do this most effectively at the network level, and we remind the community that our Report Card outlines the high-level measures; what is most important is what is happening on the ground.

 

Q: CDP asks for manual data entry from agencies once a year; does Strive (or any other system) provide individual agencies with their own data collection capacity, with uploads to a central repository?

A: (Provided by Strive) We are working on developing what we call the Learning Partner Dashboard, in collaboration with Cincinnati Public Schools, which will enable us to do this. Essentially, partners will input their data and it will be linked with student-level academic data from the school district's data warehouse. This initiative is currently in a pilot phase. In addition, Strive has piloted the use of a Google Apps site as a shared platform for organizations to submit program-level data. Strive has also supported the development of early childhood and preschool data systems, which are currently underway in both Cincinnati and Northern Kentucky.

 

Back to Top

 

Relationship to Other Systems

 

Q: Is NEFA's culturecount part of the CDP? (http://nefa.org/grants_services/culturecount_database)

A: (Provided by CDP) The CDP is indeed working closely with NEFA on an effort to expand CDP participation regionally to all the New England states. NEFA plans to use CDP data from Massachusetts in future research and is interested in being able to bring on the other New England states in order to have comparable data about all its constituent states.

 

Back to Top

 

Shared Measurement in International or Other Sectors

 

Q: Are there examples of Shared Measurement Systems being used in the international arena?

A:  (Provided by FSG) The Monitoring & Evaluation Reporting & Integration Tool (MERIT), developed by the Nonprofit Organization Knowledge Initiative (NPOKI), is an example of a system used in the international health arena by organizations like International Planned Parenthood, the International AIDS Vaccine Initiative, Management Sciences for Health and others. MERIT is a collection of tools and processes used to collect and report on, for example, PEPFAR indicators.

 

Q: Any plans to extend this to other countries?

A: (Provided by CDP) We have received queries about this from time to time but currently our focus is on expanding participation as broadly as possible throughout the United States. Use of the CDP in other countries is certainly a long-term vision of ours but would be farther into the future. 

 

Q: What examples are there of Shared Measurement systems in healthcare, poverty, human services, community development and others?

A:  (Provided by FSG) Our research studied over 20 Shared Measurement Systems operating in different issue areas; please see the final appendix of FSG's Shared Measurement report for the full list. A few examples: the Center for What Works/Urban Institute Indicators Project defines common outcome indicators for 14 health and human services program areas, including Adult Education and Family Literacy, Emergency Shelters, and Prisoner Re-entry. The Monitoring & Evaluation Reporting & Integration Tool (MERIT), developed by the Nonprofit Organization Knowledge Initiative (NPOKI), is used in the international health arena by organizations like International Planned Parenthood, the International AIDS Vaccine Initiative, Management Sciences for Health and others to collect and report on, for example, PEPFAR indicators. Public/Private Ventures' Benchmarking Project is focused on workforce development, while Cal-PASS is focused on K-16 education.

 

Q: Do you work with educational organizations as well?

A: (Provided by CDP) CDP does work with organizations whose focus is arts education; we have not yet extended the CDP beyond the arts and culture sector to the broader nonprofit world, although we have a long-term interest in seeing that happen.

 

Q: Did I understand that the CDP was now used by non-arts organizations? If not, is there a plan to do this?

A: (Provided by CDP) The CDP is not currently being used by non-arts organizations but this is a query we get fairly frequently. On the one hand, the CDP itself is focused on the arts and culture sector and is likely to remain so; on the other hand we are eager to see this tool used more broadly and have done some due diligence with partners who might take on the task of adapting the CDP profile for use by other nonprofits. Currently we are working with the City of New York on an effort to adapt the CDP for use with the City’s human service organization grantees. We know this is a special concern of community foundations, and have worked with the community foundations that are our partners in participating states to make the CDP work for them and their arts grantees. We’d be happy to discuss this further with you if that would be of interest.

 

Q: Can you cite examples of shared measurement programs that involve scholarship providers?

A: (Provided by Strive) Strive has organized a "scholarship" network in which we mapped the various scholarship resources in the region and discussed shared measures, such as first to second year retention and ultimately graduation from college. This "scholarship" network, through its action planning efforts, found there to be a misalignment between the need for scholarships and the resources available to students in the urban core and recommended the establishment of a scholarship program to support low-income students. The four higher education institutions participating in this scholarship initiative have agreed to a common set of measures and will report data on the program regularly and collectively.  

 

Back to Top

 

Process for Continuous Learning

 

Q: Could you talk a little more about Strive 6 Sigma?

A: (Provided by Strive) Strive Six Sigma is a set of tools that we use with our networks of providers to help them do collaborative action planning.  Based on GE's Lean Six Sigma, this is a process for driving continuous improvement that can lead to more efficient and effective use of resources. There are five phases -- Define, Measure, Analyze, Improve and Continuously Improve. There is more information about the process and its outputs on our website www.strivetogether.org, but in summary it's a way to get groups of providers on the same page and using a common language as it relates to evidence-based decision making and driving resources to what works for kids.  We work with GE to train community stakeholders in the Strive Six Sigma process so that they can effectively use the tools in their work. 

 

Q: Can you share documentation of Strive Six Sigma for Social Institutions?

A: (Provided by Strive) Because of the proprietary nature of this work, we have established a mutual agreement with our partner, GE Aviation, to share this information only with partners with whom we are working directly at this time. We deliver trainings in Strive Six Sigma in collaboration with GE, and these are currently free and open to the public. We share copies of the "text" that has been developed with those individuals who participate in the trainings.

 

Q: Does Strive use a common online platform as part of their adaptive learning process? If "no," why not?

A: (Provided by Strive) Our networks use a number of online tools/platforms to do their work, including Google Sites and Basecamp; however, we have found that the type of collaborative action planning that needs to happen in order to agree upon a common set of measures, determine gaps and align services, must, at least initially, happen in frequent in-person meetings. These online tools can then be used as an effective method of communication in between meetings. 

 

Q: Can Strive make their Six Sigma training materials available to others?

A: (Provided by Strive) At this point we have an agreement with GE Aviation to make the materials available only to those with whom we are working directly. We do provide trainings in Strive Six Sigma, in collaboration with GE, on a regular basis in Cincinnati and if we are able to build our capacity to do so, we hope that one day we will be able to provide these materials and trainings to a much broader audience. 

 

Back to Top

 

COMMENTS

 

 
