A microphone in a studio.

Podcasts can help pass the social distancing time

An iPhone with earbuds next to a notebook and a pen.

Sure, remote instruction probably means you’re spending more time teaching, rather than less. And when you’re doing it from home, work has a way of expanding to fit the available time — especially if you’re trying to do it while also caring for children who are home from school or daycare. Still, as the coronavirus pandemic — and the requisite social distancing — stretches on, you’ll probably find yourself looking for ways to pass your time at home, and podcasts can fit the bill. Whether you’re new to the podcast renaissance or a devoted listener, you might want to give a listen to some of the following.

Miami podcasts

  • Major Insight showcases Miami students and how they transform academic subjects into lifelong passions.
  • Reframe, the original podcast from the College of Education, Health and Society (EHS), explores the transformative and progressive work being done across the university and throughout the community. Hear insightful interviews and exclusive stories about the faculty, students, and alumni who are addressing some of the most critical issues of our time.

Miami faculty podcasts

  • Chiropractic Science, hosted by associate clinical professor Dr. Dean Smith, gets the word out about chiropractic research. Chiropractors, patients and the public will learn about chiropractic science from the experts who are doing the research.
  • Stats and Stories, hosted by university distinguished professor John Bailer; professor emeritus Richard Campbell; and assistant professor Rosemary Pennington, uses stories to give statistics meaning and statistics to give stories credibility.

Other podcasts (recommendations via H-Net)

  • Backstory with the American History Guys is a public radio show and podcast hosted by U.S. historians Ed Ayers, Peter Onuf, and Brian Balogh, who give historical perspective to topics in the headlines.
  • Cold Call distills the Harvard Business School’s legendary case studies into podcast form. Hosted by Brian Kenny, the podcast airs every two weeks and features HBS faculty discussing cases they’ve written and the lessons they impart.
  • Everything Hertz is a bi-weekly conversation-style podcast with Dan Quintana and Dr. James Heathers that explores the nuts and bolts of scientific research and academic life issues, like writing and publishing, the PhD-to-postdoc transition, and work-life balance.
  • In the Harvard Medical Labcast, Harvard Medical School scientists tackle a variety of important questions, ranging from how your neurons work to which genes play a role in particular diseases. This podcast provides context and highlights the latest trends in medical education and biomedical research through interviews and analysis.
  • Sidedoor is a podcast from the Smithsonian, produced and hosted by Tony Cohn and Megan Detrie. It tells stories about science, art, history, humanity and where they unexpectedly overlap.
  • Talking Machines is a podcast about the world of machine learning. Producer Katherine Gorman and Harvard School of Engineering and Applied Sciences Associate Ryan Adams speak with experts in the field about the latest research. Talking Machines is an independent production of Tote Bag Productions.

Microphone photo by StockSnap via Pixabay. iPhone photo via PeakPX. Both used under Creative Commons license.

Graphic of digital 1's and 0's on a high-tech looking background.

Data management plan resources are available

Screenshot of the DMPTool.org homepage: "DMPTool: Build your Data Management Plan," with a Get Started button and site statistics (28,787 users; 25,051 plans; 229 participating institutions). DMPTool is a service of the University of California Curation Center of the California Digital Library.

For some time, the NSF has required data management plans, and now the NIH has released a draft policy on making data sets used in NIH-funded research available to other researchers. (Read more about the new NIH policy from ScienceMag.org.)

Thankfully, resources for managing data are available to Miami faculty:

  • DMPTool.org allows you to create, review, and share data management plans that meet institutional and funder requirements.
  • Staff in the Center for Digital Scholarship are available for personalized reviews of data management plans prior to proposal submission.

To get started with DMPTool.org:

  • Navigate to DMPTool.org.
  • Click the big Get Started button in the middle of the screen.
  • Select Miami University (OH) from the drop-down list of institutions on the next page.
  • Click the green Next button.
  • Enter your Miami unique ID and password on the MUNet Login Page.
  • On the next page, click the green Create New DMP button and follow the prompts.

For questions about using DMPTool.org or to arrange a personalized review of your data management plan, contact Eric Johnson, Numeric and Spatial Data Librarian, Center for Digital Scholarship, King Library (513-529-4152).

Data image by DARPA (Defense Advanced Research Projects Agency), via Wikimedia Commons, public domain.


Faculty converse at Research & Innovation's 10th Annual Proposals & Awards Reception.

Neither snow, nor rain, nor gloom of night stayed Miamians from the 10th Annual Proposals and Awards Reception

Biology’s Mike Robinson (left) chats with Scripps Gerontology Center’s Kate de Medeiros at Research & Innovation’s 10th Annual Proposals & Awards Reception.

It’s become something of a tradition for the weather to be less-than-ideal on the date of Research & Innovation’s Proposals & Awards Reception, and this year was no exception. Despite having experienced a very mild winter overall, the afternoon of February 12 began rainy and ended slushy and slippery. Still, close to 60 intrepid PIs, chairs, deans, and support personnel braved the elements to join Research & Innovation staff for drinks and appetizers in King Library’s AIS. In addition to temporary refuge from the increasingly solid precipitation, each attendee received a spiral notebook with an assortment of sticky notes and flags as a token of thanks from Research & Innovation.

Photos by Research & Innovation.

Round, metal perpetual calendar for the years 1970-2024, with instructions to place the year over the month.

Deadlines and events coming up in March

Two partially visible pages from a spiral-bound calendar.

Be sure to check out the deadlines and events coming up next month:

March 10 . . . . . . . . . .  Informal research networking at Cru Gastro Lounge
March 11 . . . . . . . . . .  Human subjects/IRB application training
March 27 . . . . . . . . . .  Free NEH writing workshop and consultations at Ohio University

Perpetual calendar photo by Bryan Kennedy via Flickr. Paper calendar photo by photosteve101 via Flickr. Both used under Creative Commons license.

Round, metal perpetual calendar for the years 1970-2024, with instructions to place the year over the month.

Deadlines and events coming up in February

Two partially visible pages from a spiral-bound calendar.

Be sure to check out the deadlines and events coming up next month:

February 3 . . . . . . . . . .  Human subjects/IRB application training
February 4 . . . . . . . . . .  Human subjects/IRB application training
February 7 . . . . . . . . . .  Human subjects/IRB application training
February 11 . . . . . . . . . Human subjects/IRB application training
February 12 . . . . . . . . . Proposals & Awards Reception
February 17 . . . . . . . . . Presidents’ Day: Federal agencies are closed; Research & Innovation is open
February 20 . . . . . . . . . Human subjects/IRB application training
February 24 . . . . . . . . . Human subjects/IRB application training
February 25 . . . . . . . . . Animal Care Program orientation

Perpetual calendar photo by Bryan Kennedy via Flickr. Paper calendar photo by photosteve101 via Flickr. Both used under Creative Commons license.

Boy waving goodbye.

Out with the OARS and in with the new

Neon sign reads "NEW"

As previously announced, the Office for the Advancement of Research & Scholarship is no more. We are currently in the process of updating our communications channels to reflect our new name: the Office of Research & Innovation.

In fact, you may have noticed some changes here in this very blog. The blog is now known as the Research & Innovation Report and it has a new URL: MiamiOHResearch.org. If you’re a subscriber, you don’t need to do anything to keep seeing posts in your inbox. And if you’ve bookmarked us, no need to worry about updating the link because the old URL redirects to the new one. But, if you tell anyone about us — and we hope you will! — it would probably be good to send them to the new URL. (And a heads-up that we will be redesigning the blog, so look for a fresh new appearance in the coming weeks.)

We’ve also updated our Twitter handle and created a new Facebook page.

If you already follow us on Twitter, thank you; there’s no need to do anything to keep seeing our tweets in your feed. You’ll just see they come from MiamiOH_ResInno, rather than MiamiOH_OARS. If you don’t already follow us but want to, or if you need to update a bookmark or want to invite a friend to follow us, you can find us at twitter.com/MiamiOH_ResInno.

Whether you followed OARS’ old Facebook page (which will no longer be supported) or would like to connect with us for the first time, we encourage you to like our new page at facebook.com/MiamiOH.ResearchInnovation.

We look forward to seeing you around!

Neon sign image by mstlion via Pixabay. Wave goodbye image by mohamed_hassan via Pixabay. Both used under Creative Commons license.

A gone-to-seed dandelion in a green lawn.

A proposal doesn’t have to be perfect to be funded

Three checkboxes on a piece of paper. Each checkbox has, respectively, a smiley, neutral, or sad face next to it.

We recently updated the proposal writing workshop that we’ve been running for years to include more active learning elements. One of the activities we introduced — small group work to analyze sections of sample proposals — resulted in some unintended consequences.

As anyone who has scoured the internet for publicly available sample proposals can attest, they’re really not that easy to find, especially if you’re interested in representing a range of disciplines and funding agencies. And hardly anyone posts the ones that did not result in funding, the ones that — to put it bluntly — failed.

So when we distributed sample proposals to workshop participants, they were all proposals that had resulted in awards. But, as participants soon discovered, that did not mean they were flawless. To be sure, there were good things going on in these proposals. But, in comparing these sample proposals to information we had provided about effective proposal writing, workshop attendees also found that the sample proposals fell short in a range of ways. Some failed to provide necessary context. Others lacked a clear needs statement or didn’t manage to articulate explicit connections between goals, objectives, activities, and evaluation. There were inconsistencies between narratives and budgets. There was tortured writing.

My colleagues and co-presenters, Amy Cooper and Anne Schauer, and I have reviewed countless proposals. We take it for granted that a proposal need not be, like Mary Poppins, practically perfect in every way to be funded. Perhaps to our shame, it didn’t occur to us to let the workshop participants know that all of the examples we were sharing were successful in securing funding. I have to admit to having been a little gobsmacked, then, to receive this request from a participant about halfway into the six-session workshop: “I wonder if it is possible to also [share] some successful proposals/good examples.” That’s what we’d been doing all along.

It turned out to be a teachable moment for both our team and the workshop participants. The lesson we learned as workshop facilitators was that we need to make our assumptions explicit, especially when providing guidance to beginners. The lesson we were able to share with the developing proposal writers is that there is no such thing as a perfect proposal.

The truth is that there are many factors in grantmaking beyond what appears on the page. For better or worse, politics, professional reputations, and social networks all matter. Sponsors’ hidden values and practices matter. The need to balance research/project portfolios matters. These factors can all help explain why a flawed proposal might be funded, and many of them are beyond the PI’s control.

Our argument — the very reason we even offer a proposal writing workshop in the first place — is that it’s in every PI’s best interest to control the things they can, and that includes the writing. I don’t remember the source, but someone once put it this way: No amount of good writing will save a bad idea, but bad writing can doom a good idea. In analyzing some less-than-ideal — yet funded — proposals, our workshop participants discovered that the system will tolerate some imperfection. That should come as a relief to anyone who will ever submit a proposal.

Written by Heather Beattey Johnston, Associate Director of Research Communications, Office of Research & Innovation, Miami University.

Checkbox photo via Pxfuel. Dandelion photo via Peakpx. Both used under Creative Commons license.


Round, metal perpetual calendar for the years 1970-2024, with instructions to place the year over the month.

Deadlines and events coming up in December

Two partially visible pages from a spiral-bound calendar.

Be sure to check out the deadlines and events coming up next month:

December 6 . . . . . . . . . .  Nomination deadline: Distinguished Scholar and Junior Faculty Scholar Awards
December 11 . . . . . . . . .  OARS/Research & Sponsored Programs Office Hours in Armstrong Student Center
December 24-31 . . . . . .  OARS/Research & Innovation closed
December 25 . . . . . . . . . Christmas Day: Federal agencies closed

Perpetual calendar photo by Bryan Kennedy via Flickr. Paper calendar photo by photosteve101 via Flickr. Both used under Creative Commons license.

Michael Loadenthal reviews work performed by Prosecution Project team members at a work session

The Prosecution Project aided by Research Computing Support group

Prosecution Project team members (from left) Sarah Carrier, Olivia Sellegren, Meekael Hailu, and Morgan Demboski, discuss data at a work session.

For all the time, effort, and resources devoted to thwarting terrorism, it’s surprisingly difficult to get a complete view of sociopolitical violence in the United States. The Anti-Defamation League and the Southern Poverty Law Center monitor racist and white nationalist violence. The Center for Biomedical Research collects data on attacks against animal testing facilities. The University of Maryland tracks international incidents of terrorism in its Global Terrorism Database. But because none of these groups’ datasets interface with any other’s, information remains siloed and analysis of broader relationships is stymied.

Michael Loadenthal is trying to break down those silos. A visiting assistant professor of sociology and social justice studies at Miami University, Loadenthal directs the Prosecution Project, which seeks to understand how terrorism, extremism, and hate crimes are prosecuted in the U.S. justice system. The Prosecution Project’s dataset includes all crimes of political violence, without regard to the identity of the targets or the ideology of the perpetrators, so it paints a uniquely comprehensive picture.

“We’re looking to understand the patterns that exist between who a criminal defendant is, who commits crimes motivated by sociopolitical violence, and how that relates to the crime they committed and the way it’s prosecuted,” Loadenthal says.

Altogether, Loadenthal and his project team — which consists entirely of his current and former undergraduate students — account for 46 variables in each case they add to their dataset. Each case is coded by two members of the project’s coding team. After review of the initial coding by at least two senior members of the analysis team, the data are then reviewed by an auditor. So far, the team has fully coded and reviewed about 40% of the 5,000 cases they have identified. Once the dataset is fully processed, Loadenthal intends to make it available to the public.

Working toward that goal has required Loadenthal to figure out a way around some technical roadblocks. One place he turned for help is Miami’s Research Computing Support group, particularly Greg Reese, senior research computing support specialist.

Among other things, Reese developed a custom audit program that Loadenthal says “helps machine some of the irregularity out of the data.” Reese’s program reads the data the project team has collected and checks it against a set of defined rules — about 30 in all — to find irregularities or incongruences. Certain mistakes, like an extra space typed after a defendant’s name, will cause a computer to classify data incorrectly. (A computer treats “John Jones ” — with a space after “Jones” — and “John Jones” — no space after “Jones” — as two different people, for instance.) Reese’s audit program searches for such mistakes and flags them so they can be corrected to improve the integrity of the overall dataset.
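The Prosecution Project’s actual audit program isn’t public, so as a rough illustration only — with a hypothetical `defendant` field standing in for the team’s real variables — a whitespace-normalization rule like the one described above might be sketched in Python like this:

```python
def audit_names(records):
    """Flag records whose 'defendant' field has stray whitespace.

    A trailing or doubled space makes "John Jones " and "John Jones"
    look like two different people to a computer, so the check reports
    each offending record along with its cleaned-up value.
    """
    problems = []
    for i, record in enumerate(records):
        name = record["defendant"]
        # Trim the ends and collapse interior runs of spaces to one.
        cleaned = " ".join(name.split())
        if cleaned != name:
            problems.append((i, name, cleaned))
    return problems

cases = [
    {"defendant": "John Jones "},   # trailing space
    {"defendant": "John Jones"},    # clean
    {"defendant": "Jane  Doe"},     # doubled interior space
]
for index, bad, fixed in audit_names(cases):
    print(f"record {index}: {bad!r} -> {fixed!r}")
```

A real auditor would apply a few dozen such rules (the article mentions about 30) across all 46 variables, but the pattern — read each record, test it against a rule, flag mismatches for human correction — is the same.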

Loadenthal is grateful for Reese’s willingness not only to listen to the specific challenges the Prosecution Project faces, but also to develop custom solutions.

“I’ve seen Greg learn different aspects of new computer languages in order to code what we need,” Loadenthal says. “He’s taught himself new skill sets in order to accommodate us. Instead of trying to use something he’s already familiar with — let’s say C++ — he adapted and learned Python, which is better suited to what we’re doing.”

Reese’s impulse for inclusivity fits something of a theme for the Prosecution Project, which has remarkably diverse personnel. Although most members of Loadenthal’s 60-person team identify as female, they otherwise represent the gamut of student demographics and identities.

“We have a highly diverse research team that closely resembles the world off-campus,” Loadenthal says. “Our students represent a variety of races, ethnicities, nationalities, and religions. They don’t all conform to binary notions of gender. In addition, they represent a broad set of academic majors, from biology to English to political science to sociology.”

Loadenthal isn’t sure why the Prosecution Project’s team is so diverse. Diversity is more common in the social justice studies and upper-level sociology classes Loadenthal teaches than in the university as a whole, but beyond that, the team’s diversity doesn’t result from any active recruiting strategy. Students self-select to participate in the project — he waits for them to approach him.

The key to the team’s diversity may lie in diverse students’ inherent interest in finding answers to questions about systemic inequalities, which tend not to work out in their favor. Loadenthal says one of his early goals for the Prosecution Project was to explain sentencing disparities. He acknowledges that many members of his team expected to find that nationality, religion, and race play a role. And while they did find that African-, Asian-, and Middle Eastern-born, Muslim defendants with foreign-sounding names often receive harsher sentences than American-born, Christian defendants of European descent, they also found that simple xenophobia didn’t fully explain the differences. Loadenthal is committed to making Prosecution Project data available to other researchers who can help develop more nuanced explanations.

“This project is the only one that co-mingles and assimilates data points so you can make comparisons between ideologies and not restrict it to one particular movement,” Loadenthal says. “That’s the main goal, to ask questions that aren’t specific to one interest group.”

Written by Heather Beattey Johnston, Associate Director of Research Communications, Office for the Advancement of Research and Scholarship, Miami University.

Photos by Miami University Photo Services.


Image improperly rendered on digital screen.

eRA Commons glitch prompts reminders about preparing proposals and checking submissions

Order forms and packing box, with text reminding viewer to compare numbers on forms and box to ensure they match. "Check here. And here. Be sure it's right. Check and double check!"

Earlier this fall, there was a glitch in NIH’s electronic submission system, eRA Commons, that caused blank pages to appear in place of content in some grant application submissions. NIH attributed the error to PDF attachments that were generated from scanned documents, rather than text files.

Although the eRA Commons issue has now been resolved and our proposal facilitators, Anne Schauer and Amy Cooper, did not notice this error happening with any Miami proposals, we thought this was a good opportunity to issue a few reminders about submitting proposals, whether using Miami’s Cayuse system or any other submission system.

Reminder 1

It’s always best to generate PDFs from text files created in Word or another word processing program. Using scanned images to create PDFs should be avoided whenever possible.

Reminder 2

It is the PI’s responsibility to ensure that the submission is complete and accurate. Your proposal facilitator reviews your application prior to submission, but because they don’t have expertise in your field, they won’t always recognize when something — a technical figure, for instance — has not rendered properly. You should always check your proposal in the sponsor’s system (e.g., NIH’s eRA Commons) following submission to verify that everything appears the way it should.
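Part of that check can be mechanized before you even submit. As a minimal sketch (the page texts here would come from a PDF library such as pypdf’s `extract_text()`; that library choice is our assumption, not part of any sponsor’s tooling), you can flag pages that yield no extractable text, a common symptom of a scanned-image PDF of the kind that triggered the eRA Commons glitch:

```python
def find_suspect_pages(page_texts):
    """Return 1-based numbers of pages with no extractable text.

    Pages built from scanned images usually extract as empty strings;
    those are the ones worth eyeballing in the sponsor's system.
    """
    return [n for n, text in enumerate(page_texts, start=1) if not text.strip()]

# Example: texts already pulled from a proposal PDF, one string per page.
pages = ["Specific Aims...", "", "Research Strategy...", "   "]
print(find_suspect_pages(pages))  # pages 2 and 4 have no extractable text
```

This is only a screening aid; it doesn’t replace opening the submitted proposal in the sponsor’s system and checking that figures and special characters rendered correctly.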

Reminder 3

Many sponsors allow a period of time during which a PI may review and “fix” a submitted proposal. For example, NIH allows submissions to be reviewed, withdrawn, and resubmitted in eRA Commons for two days following submission. However, with many sponsors — including NIH — once the submission deadline has passed, no changes may be made to a proposal, even if the allotted review window has not yet passed. This is one of many good reasons not to wait until the last minute to submit a proposal. If you submit at the last minute, there may not be enough time for you to review the submitted proposal, let alone withdraw it, fix it, and resubmit it.

Digital screen glitch image by Rosa Menkman via Wikimedia Commons, used under Creative Commons license. Check and double check image from the National Archives at College Park, public domain.