Newsletter Subject

The Edge: ‘The Educational Equivalent of Energy Star’

From

chronicle.com

Email Address

newsletter@newsletter.chronicle.com

Sent On

Wed, Oct 13, 2021 11:00 AM

Email Preheader Text

How can prospective students see the value of colleges’ and other providers’ academic and work-force-training programs?

Did someone forward you this newsletter? Sign up free to receive your own copy.

I’m Goldie Blumenstyk, a senior writer at The Chronicle covering innovation in and around academe. Here’s what I’m thinking about this week.

Could this measure of quality become the “educational equivalent of Energy Star”?

The acronym EQOS is one only a wonk could love. And the organization’s ambition — to establish a uniform and useful student-centered system to measure the value of academic and work-force-training programs — comes up against some big hurdles (not least the mishmash of data on student outcomes). So the Education Quality Outcomes Standards board faces some tough odds. But the fact that it exists at all is a testament to the power of its mission (and its appeal to funders to give it a lifeline).

I first wrote about the model six years ago, when it was a fledgling idea for an alternative to traditional accreditation aimed primarily at coding boot camps and other postsecondary providers besides colleges, and then again as the proposal evolved. Today, three partnerships with states and a series of other projects have taken the idea “beyond theoretical,” says Kristin Sharp, the group’s chief executive. But, she’s quick to acknowledge, “just barely.”

I continue to see merit in this approach. All the current talk about alternatives to college and the potential infusion of billions of federal work-force-training dollars creates even more urgency to see which programs are worth attending. An effective system for that, Sharp told me, could help “get people away from dead-end jobs.”

Need another reason? Consider that an organization called Credential Engine continues to develop a directory of educational credentials that now numbers nearly one million but provides little information to help people assess which programs are valuable and which are duds.

Sharp says that ultimately EQOS hopes to let colleges and other providers use tools that make results visible to prospective students — like “the educational equivalent of Energy Star,” she says. That point feels a long way off, though, even if Sharp says EQOS hopes to develop the institutional tools in 2022.

The EQOS model calls for measuring programs, academic and otherwise, on five criteria: learning, completion, placement, earnings, and student satisfaction. Each would be measured by an established standard that could be audited or verified by an outside party. Unlike accreditation, the process isn’t meant to assess an institution’s quality but rather to evaluate programs and, as Sharp puts it, their “impact on a student’s life.”

I was curious to understand what obstacles stand in the way. Here’s some of what Sharp told me.

Finding the right data is hard. Heck, even deciding what data to collect — on, say, earnings — can be a complex calculation. Do you measure participants’ salary changes one year after they completed a program? Three? What happens when that information isn’t readily available? EQOS has been working on evaluations of job-training programs in Colorado and Indiana, for example, two states known for having rich salary data. But even there, the records aren’t complete. So often “you’re cobbling together sources,” says Sharp.
It would make sense, she says, to collect outcomes data from institutions offering the programs, but their systems “were never set up that way.” (Sound familiar? In the spring, the Project on Workforce at Harvard University also found scant evidence of outcomes assessment among the 300-plus training programs it studied.)

Showing what students learned isn’t any easier. While assessing “satisfaction” is doable — EQOS often relies on the Net Promoter Score for that — measuring the skills and knowledge students gained (or didn’t) “is its own beast,” Sharp told me. The EQOS philosophy calls for institutions themselves to identify their programs’ educational objectives, the criteria for success, and how to verify that those criteria have been met. That allows for flexibility and institutional autonomy, but it seems to make it hard to routinize the evaluation.

Nothing compels institutions to join this voluntary effort. The institutions EQOS has worked with (including a Denver coding boot camp I once profiled) believe in highlighting their outcomes, Sharp says, and “want to have a way to do it systematically.” But unlike accreditation, which is required for participating in federal student-aid programs, this process doesn’t have such a lever (although Sharp has argued for one).

The model doesn’t seem as applicable to more-traditional college programs. I’m eager to see results from an EQOS pilot begun in August with seven two-year and four-year colleges in New Jersey. State officials there told me this week that they’re using the project to hone their own evaluations. “We’ll take what works and leave what doesn’t,” said Brian Bridges, the secretary of higher education. The state is especially interested in measuring how educational programs affect social mobility.

The New Jersey experiment could also answer some of my questions about whether colleges will consider evaluations like this too burdensome to bother with. Bridges, despite being a data guy himself, says he’s mindful of that. The state won’t adopt new standards, he said, just “for the sake of adding another accountability metric.” Officials there hope to report out results after the project concludes, in February.

More colleges wanted to join the pilot, said Annie Khoa, a senior adviser to Bridges, but EQOS didn’t have the capacity. “We were very surprised,” she said, “by how much interest this got.”

Recommended reading. Here are some education stories from other outlets that recently caught my eye. Did I miss a good one? Let me know.

- Rural schools have their challenges, but they can also be “sites of learning, community, and excellence,” two education scholars write in the Daily Yonder. That’s often overlooked, with “tragic consequences.”

- “In an era of viral digital disinformation, eroding governance norms, and increased political violence, the same old campus ‘civic engagement’ programs no longer seem sufficient,” EdSurge reports. Now colleges are rethinking their efforts.

- A new study shows that the lack of internet access in the United States is stark in rural Southern regions with higher Black populations, and, as the Thomson Reuters Foundation reports, experts say “that dynamic amplifies existing ‘structural racism.’”

Correction. Last week, in writing about Brandman University becoming UMass Global, I incorrectly described the role of a new online-leadership group at the University of Massachusetts system.
It won’t be overseeing the rollout of UMass Global, but it does meet regularly to discuss ways of collaborating with that institution on matters like transfer agreements, complementary academic programs, and marketing plans, and on advancing online education across the entire system.

Got a tip you’d like to share or a question you’d like me to answer? Let me know, at goldie@chronicle.com. If you have been forwarded this newsletter and would like to see past issues, find them here. To receive your own copy, free, register here. If you want to follow me on Twitter, @GoldieStandard is my handle.

Goldie’s Weekly Picks

ENROLLMENT: Virtual Tours Could Get More First-Generation Students to College. Here’s What They Want to See. By Taylor Swaak. With the fall college-tour season in progress, many prospective students, especially those unable to travel, are scouring institutions’ websites and social media to find the right fit.

STUDENTS: Do ‘Inclusive Access’ Textbook Programs Save Students Money? A New Site Urges Everyone to Read the Fine Print. By Taylor Swaak. Advocates of open education resources are concerned colleges are buying into the model without knowing for sure whether it’s actually saving their students money.

CAMPUS HEALTH: This University Announced a Vaccination Mandate. Now It’s Not So Sure. By Nell Gluckman. In a rare move, the University of Akron may be walking back a policy that requires students and employees to be vaccinated against Covid-19.
