As previously announced, the Office for the Advancement of Research & Scholarship is no more. We are currently in the process of updating our communications channels to reflect our new name: the Office of Research & Innovation.
In fact, you may have noticed some changes here in this very blog. The blog is now known as the Research & Innovation Report and it has a new URL: MiamiOHResearch.org. If you’re a subscriber, you don’t need to do anything to keep seeing posts in your inbox. And if you’ve bookmarked us, no need to worry about updating the link because the old URL redirects to the new one. But, if you tell anyone about us — and we hope you will! — it would probably be good to send them to the new URL. (And a heads-up that we will be redesigning the blog, so look for a fresh new appearance in the coming weeks.)
We’ve also updated our Twitter handle and created a new Facebook page.
If you already follow us on Twitter, thank you; there’s no need to do anything to keep seeing our tweets in your feed. You’ll just see they come from MiamiOH_ResInno, rather than MiamiOH_OARS. If you don’t already follow us but want to, or if you need to update a bookmark or want to invite a friend to follow us, you can find us at twitter.com/MiamiOH_ResInno.
Whether you followed OARS’ old Facebook page (which will no longer be supported) or would like to connect with us for the first time, we encourage you to like our new page at facebook.com/MiamiOH.ResearchInnovation.
We look forward to seeing you around!
Neon sign image by mstlion via Pixabay. Wave goodbye image by mohamed_hassan via Pixabay. Both used under Creative Commons license.
We recently updated the proposal writing workshop that we’ve been running for years to include more active learning elements. One of the activities we introduced — small group work to analyze sections of sample proposals — resulted in some unintended consequences.
As anyone who has scoured the internet for publicly available sample proposals can attest, they’re really not that easy to find, especially if you’re interested in representing a range of disciplines and funding agencies. And hardly anyone posts the ones that did not result in funding, the ones that — to put it bluntly — failed.
So when we distributed sample proposals to workshop participants, they were all proposals that had resulted in awards. But, as participants soon discovered, that did not mean they were flawless. To be sure, there were good things going on in these proposals. But, in comparing these sample proposals to information we had provided about effective proposal writing, workshop attendees also found that the sample proposals fell short in a range of ways. Some failed to provide necessary context. Others lacked a clear needs statement or didn’t manage to articulate explicit connections between goals, objectives, activities, and evaluation. There were inconsistencies between narratives and budgets. There was tortured writing.
My colleagues and co-presenters, Amy Cooper and Anne Schauer, and I have reviewed countless proposals. We take it for granted that a proposal need not be, like Mary Poppins, practically perfect in every way to be funded. Perhaps to our shame, it didn’t occur to us to let the workshop participants know that all of the examples we were sharing were successful in securing funding. I have to admit to having been a little gobsmacked, then, to receive this request from a participant about halfway into the six-session workshop: “I wonder if it is possible to also [share] some successful proposals/good examples.” That’s what we’d been doing all along.
It turned out to be a teachable moment for both our team and the workshop participants. The lesson we learned as workshop facilitators was that we need to make our assumptions explicit, especially when providing guidance to beginners. The lesson we were able to share with the developing proposal writers is that there is no such thing as a perfect proposal.
The truth is that there are many factors in grantmaking beyond what appears on the page. For better or worse, politics, professional reputations, and social networks all matter. Sponsors’ hidden values and practices matter. The need to balance research/project portfolios matters. These factors can all help explain why a flawed proposal might be funded, and many of them are beyond the PI’s control.
Our argument — the very reason we even offer a proposal writing workshop in the first place — is that it’s in every PI’s best interest to control the things they can, and that includes the writing. I don’t remember the source, but someone once put it this way: No amount of good writing will save a bad idea, but bad writing can doom a good idea. In analyzing some less-than-ideal — yet funded — proposals, our workshop participants discovered that the system will tolerate some imperfection. That should come as a relief to anyone who will ever submit a proposal.
Written by Heather Beattey Johnston, Associate Director of Research Communications, Office of Research & Innovation, Miami University.
Checkbox photo via Pxfuel. Dandelion photo via Peakpx. Both used under Creative Commons license.
For all the time, effort, and resources devoted to thwarting terrorism, it’s surprisingly difficult to get a complete view of sociopolitical violence in the United States. The Anti-Defamation League and the Southern Poverty Law Center monitor racist and white nationalist violence. The Center for Biomedical Research collects data on attacks against animal testing facilities. The University of Maryland tracks international incidents of terrorism in its Global Terrorism Database. But because none of these groups’ datasets interface with any other’s, information remains siloed and analysis of broader relationships is stymied.
Michael Loadenthal is trying to break down those silos. A visiting assistant professor of sociology and social justice studies at Miami University, Loadenthal directs the Prosecution Project, which seeks to understand how terrorism, extremism, and hate crimes are prosecuted in the U.S. justice system. The Prosecution Project’s dataset includes all crimes of political violence, without regard to the identity of the targets or the ideology of the perpetrators, so it paints a uniquely comprehensive picture.
“We’re looking to understand the patterns that exist between who a criminal defendant is, who commits crimes motivated by sociopolitical violence, and how that relates to the crime they committed and the way it’s prosecuted,” Loadenthal says.
Altogether, Loadenthal and his project team — which consists entirely of his current and former undergraduate students — account for 46 variables in each case they add to their dataset. Each case is coded by two members of the project’s coding team. After review of the initial coding by at least two senior members of the analysis team, the data are then reviewed by an auditor. So far, the team has fully coded and reviewed about 40% of the 5,000 cases they have identified. Once the dataset is fully processed, Loadenthal intends to make it available to the public.
Working toward that goal has required Loadenthal to figure out a way around some technical roadblocks. One place he turned for help is Miami’s Research Computing Support group, particularly Greg Reese, senior research computing support specialist.
Among other things, Reese developed a custom audit program that Loadenthal says “helps machine some of the irregularity out of the data.” Reese’s program reads the data the project team has collected and checks it against a set of defined rules — about 30 in all — to find irregularities or incongruences. Certain mistakes, like an extra space typed after a defendant’s name, will cause a computer to classify data incorrectly. (A computer treats “John Jones ” — with a space after “Jones” — and “John Jones” — no space after “Jones” — as two different people, for instance.) Reese’s audit program searches for such mistakes and flags them so they can be corrected to improve the integrity of the overall dataset.
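The general approach — run every record through a set of defined rules and flag whatever fails — can be sketched in a few lines. This is only an illustrative sketch in Python, not Reese’s actual program; the field names (“defendant,” “disposition”) and the two example rules are hypothetical stand-ins for the roughly 30 rules the real audit applies.

```python
# Minimal sketch of a rule-based data audit. The schema and rules here are
# illustrative only, not the Prosecution Project's actual design.

def audit(records, rules):
    """Check every record against every rule; return a list of flags."""
    problems = []
    for i, record in enumerate(records):
        for rule_name, rule in rules.items():
            if not rule(record):
                # Flag the record for human correction rather than fixing
                # it silently, so coders can verify the intended value.
                problems.append((i, rule_name))
    return problems

# Two example rules: no stray whitespace in a name (the "John Jones " case
# described above), and a disposition drawn from a known vocabulary.
RULES = {
    "name_has_no_stray_whitespace":
        lambda r: r["defendant"] == r["defendant"].strip(),
    "disposition_is_recognized":
        lambda r: r["disposition"] in {"convicted", "acquitted", "pending"},
}

records = [
    {"defendant": "John Jones ", "disposition": "convicted"},  # trailing space
    {"defendant": "John Jones", "disposition": "convicted"},
]

flags = audit(records, RULES)
# flags contains one entry: record 0 fails the whitespace rule.
```

Flagging rather than auto-correcting is the key design choice: an automated fix might merge two genuinely distinct values, while a flag routes the question back to a human coder.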
Loadenthal is grateful for Reese’s willingness not only to listen to the specific challenges the Prosecution Project faces, but also to develop custom solutions.
“I’ve seen Greg learn different aspects of new computer languages in order to code what we need,” Loadenthal says. “He’s taught himself new skill sets in order to accommodate us. Instead of trying to use something he’s already familiar with — let’s say C++ — he adapted and learned Python, which is better suited to what we’re doing.”
Reese’s impulse for inclusivity fits something of a theme for the Prosecution Project, which has remarkably diverse personnel. Although most members of Loadenthal’s 60-person team identify as female, they otherwise represent the gamut of student demographics and identities.
“We have a highly diverse research team that closely resembles the world off-campus,” Loadenthal says. “Our students represent a variety of races, ethnicities, nationalities, and religions. They don’t all conform to binary notions of gender. In addition, they represent a broad set of academic majors, from biology to English to political science to sociology.”
Loadenthal isn’t sure why the Prosecution Project’s team is so diverse. Diversity is more common in the social justice studies and upper-level sociology classes Loadenthal teaches than in the university as a whole, but beyond that, the team’s diversity doesn’t result from any active recruiting strategy. Students self-select to participate in the project — he waits for them to approach him.
The key to the team’s diversity may lie in diverse students’ inherent interest in finding answers to questions about systemic inequalities, which tend not to work out in their favor. Loadenthal says one of his early goals for the Prosecution Project was to explain sentencing disparities. He acknowledges that many members of his team expected to find that nationality, religion, and race play a role. And while they did find that African-, Asian-, and Middle Eastern-born, Muslim defendants with foreign-sounding names often receive harsher sentences than American-born, Christian defendants of European descent, they also found that simple xenophobia didn’t fully explain the differences. Loadenthal is committed to making Prosecution Project data available to other researchers who can help develop more nuanced explanations.
“This project is the only one that co-mingles and assimilates data points so you can make comparisons between ideologies and not restrict it to one particular movement,” Loadenthal says. “That’s the main goal, to ask questions that aren’t specific to one interest group.”
Written by Heather Beattey Johnston, Associate Director of Research Communications, Office for the Advancement of Research and Scholarship, Miami University.
Earlier this fall, there was a glitch in NIH’s electronic submission system, eRA Commons, that caused blank pages to appear in place of content in some grant application submissions. NIH attributed the error to PDF attachments that were generated from scanned documents, rather than text files.
Although the eRA Commons issue has now been resolved and our proposal facilitators, Anne Schauer and Amy Cooper, did not notice this error happening with any Miami proposals, we thought this was a good opportunity to issue a few reminders about submitting proposals, whether using Miami’s Cayuse system or any other submission system.
It’s always best to generate PDFs from text files created in Word or another word processing program. Using scanned images to create PDFs should be avoided whenever possible.
It is the PI’s responsibility to ensure that the submission is complete and accurate. Your proposal facilitator reviews your application prior to submission, but because they don’t have expertise in your field, they won’t always recognize when something — a technical figure, for instance — has not rendered properly. You should always check your proposal in the sponsor’s system (e.g., NIH’s eRA Commons) following submission to verify that everything appears the way it should.
Many sponsors allow a period of time during which a PI may review and “fix” a submitted proposal. For example, NIH allows submissions to be reviewed, withdrawn, and resubmitted in eRA Commons for two days following submission. However, with many sponsors — including NIH — once the submission deadline has passed, no changes may be made to a proposal, even if the allotted review window has not yet passed. This is one of many good reasons not to wait until the last minute to submit a proposal. If you submit at the last minute, there may not be enough time for you to review the submitted proposal, let alone withdraw it, fix it, and resubmit it.
Digital screen glitch image by Rosa Menkman via Wikimedia Commons, used under Creative Commons license. Check and double check image from the National Archives at College Park, public domain.
I’m not one of those people who love Halloween, so I probably won’t be donning a costume today. But if I were to dress up as a border collie, my experience as a research administrator would help me embody the character. How are research administrators (RAs) similar to border collies? Read on to find out.
Border collies are perhaps best known for their ability to keep a flock together and moving in the same direction. Much like border collies, research administrators help keep their communities moving toward strategic goals established by their institutions’ leadership. We develop policies and procedures that guide researchers in certain directions, we create checklists that keep faculty headed in the right direction with proposal submissions, and we design programs and incentives that encourage the pursuit of certain paths.
Border collies help keep their flocks safe from predators. Safety is also an essential function for research administrators. The policies and procedures we develop and enforce are created, in part, to protect our researchers and institutions from outside forces that could harm them. Our guidance documents and decision trees help investigators make informed decisions about things that impact their research. We work to ensure no one is caught unawares by disadvantageous sponsor terms or becomes subject to legal action as a result of inadvertently violating federal and state law. Those of us in research compliance protect the human and animal subjects that are part of our research communities by seeing that relevant federal guidelines are followed. Contract negotiators strategize to balance the interests of the investigator with those of the institution to arrive at an optimal final agreement.
Sheep may give more thought to the border collie’s potential to nip at their heels than they do to her potential to herd them to new grazing grounds, but that’s another essential function for these dogs. A flock that keeps grazing the same depleted pasture will not be nearly as robust as one that has access to fresh forage. The same is true for research, and RAs pave the way for new opportunities. Investigators often see ours as the office of “no,” but the truth is that RAs find creative ways to get to “yes.” Without the financial management expertise of post-award staff, far fewer sponsors would be willing to award funds in support of the research at our institutions. Without the keen eyes of pre-award staff, far more proposals would be returned without review for non-compliance. Without those who manage research compliance and review protocols, fewer studies would meet the ethical requirements of sponsors. And, of course, without the dedication and determination of research development staff, our institutions’ investigators would have access to — and be competitive for — far fewer opportunities.
With a border collie on the job, no sheep is ever left behind. The dog helps keep the flock intact by spurring on stragglers and rounding up those who have become lost. RAs also spur on stragglers. We search for those who are lost, reaching out to check on investigators we haven’t heard from in a while. We encourage those who got disappointing reviewer comments to take those comments to heart, tweak the proposal, and resubmit. Finally, we work to remove barriers for those who might not be submitting proposals at all.
Border collies use their experience and intuition to determine when one of their flock has crossed an unseen boundary and to recognize where their pasture ends and the next begins. Likewise, research administrators — whose unofficial motto is “it depends” — are relied upon to know where the boundaries are. Where does funding cross the line from gift to grant? When does one use a vendor vs. issuing a subaward agreement? What, exactly, are the allowable and allocable expenses that can be included in a budget?
Although border collies have a natural instinct for herding, it often takes a lot of time and effort to develop a dog’s skills so that she becomes a true partner. Likewise, research administrators devote a lot of time and effort to developing our skills. We are always thinking, learning, and growing so that we can own our place as true partners to our investigators and our institutions. Without us, the research enterprise would be more chaotic and less productive.
Adapted from “Research Without Border Collies,” written by Heather Beattey Johnston and Robyn Remotigue and appearing in the Oct./Nov. 2019 issue of NCURA Magazine.
One of the things we’ve heard frequently over the years from Miami’s more seasoned researchers is that they miss the days of the Old Miami Inn, where OARS would organize occasional get-togethers. It was an opportunity to hang out, with no particular agenda, but still manage to come away with new connections, and maybe even new collaborators.
Although we can’t bring back the Miami Inn, we are trying to revive the spirit of those informal gatherings with a series of program-free networking events. The first event took place last fall. Dubbed “Networking in the Club Lounge” for its location in Goggin Ice Center’s Club Lounge, it was a gathering of around 40 research-active faculty who enjoyed drinks and heavy appetizers on OARS while catching up with existing colleagues and getting to know new ones.
We are repeating that event this fall. The Second Annual Networking in the Club Lounge event will be held this coming Tuesday, October 29, from 5:00 to 7:00pm. Registration is open through Monday.
In addition, we are organizing a series of no-host events at local restaurants. The first of these was held September 12 at Books & Brews in Oxford. It was sparsely attended, but the faculty who came were new to Miami and enthusiastic about their research and about meeting potential collaborators. We are hoping for stronger turnout at our next gathering, on Tuesday, November 12, from 5:00 to 7:00pm at Cru Gastro Lounge. No registration is required. Menus and pricing are available on the restaurant’s website.
What are your thoughts about these informal opportunities for research networking? If you’re a Miami University researcher, please let us know in the comments.
Networking image by Gordon Johnson via Pixabay. Wire connections image via Max Pixel. Both used under Creative Commons license.
This is the fourth in a series of updated posts (the others are here, here, and here) designed to help Miami University faculty, staff, and students learn to use SPIN to find potential sources of funding for their research, scholarly, and creative projects.
You do not need a profile to conduct searches or to set preferences and filters in SPIN. However, if you want to save your preferences/filters so that they are active when you access SPIN in the future, you will need a profile and you will need to login to SPIN each time you use it. For more about accessing SPIN and creating a profile, read this post.
Setting preferences and filters
Setting preferences and filters in SPIN helps narrow your results by weeding out opportunities that may not be relevant to your situation. For instance, if you are a faculty member, you could set your preferences to exclude opportunities for which only students are eligible to apply.
One preference we recommend most users change from the default is whether SPIN displays opportunities that may already be closed. By default, SPIN does not display these opportunities. That means that if a program has a deadline of March 1 and it is now March 15, SPIN will not display that program in its list of search results. However, it can be useful for researchers to be aware of these opportunities as they plan their future submissions, so we recommend changing preferences to include these opportunities. Instructions for making this change are outlined below and may serve as a model for making other adjustments to preferences so that search results are as relevant as possible.
To change the default “Closed Opportunities” preference:
Hover over Preferences in the black menu bar.
Select the preference you would like to adjust, in this case Closed Opportunities.
In the resulting pop-up window, use the drop-down menu to change the “Opportunities that have no documented future deadlines” field from “Exclude” to “Include.”
Click the Save and Exit button.
Once you have set a preference, you’ll see this text under the search box in any type of search: “You have additional filters active. Click here to edit them.” To edit multiple preferences at once:
Click on the Click here to edit them link.
On the “Current Settings” page, click on the Edit button in the top right corner of the group of preferences you want to adjust. (To remove all preferences/filters and start over with SPIN’s default settings, click the Reset Filters button near the top right of this page.)
In the resulting pop-up window, adjust your settings. Note that there are multiple tabs across the top: “Applicant Location,” “Applicant Type,” “Project Type,” “Project Location,” “Citizenship,” and “Sponsor Type.” You will select your preferences the same way you did in the keyword search — by dragging and dropping them from the options box on the left into the “Chosen” box on the right. (You can also select options and use the arrow button in the middle to move them to the “Chosen” box.)
Once you have your preferences adjusted, click the Save and Exit button in the lower right of the pop-up window.
If you need more help working with preferences and filters, watch this training video. (More training videos can be found in SPIN by hovering over Help in the menu bar and selecting Training Videos.)
The New Faculty Grant Planning and Support (GPS) program is a professional development program designed to support new tenure-track faculty in developing competitive applications for extramural funding programs. Specifically, the program:
Helps new faculty map out a plan for which funding opportunities to target in their first five years at Miami
Offers new faculty grantsmanship mentorship and support
New Faculty GPS consists of two phases.
Phase 1 – Individual Development Plan
In Phase 1, each participant works with an external consultant to create an individual development plan (IDP). The IDP will include goals for teaching, research, and service, and will emphasize external grant-seeking. IDPs are meant to be living documents that can grow and change as participants move through the early stages of their careers.
Phase 2 – Proposals for External Funding
Faculty who are selected to participate in Phase 2 will work one-on-one with a consultant-mentor to develop competitive proposals for external funding — one in each of their five years of participation. The consultant-mentor will provide a complete and comprehensive review of the draft application, and provide:
An overview of important elements of the proposal
Constructive criticism on the draft proposal
Guidance on exploring different options for the research agenda and other elements (e.g., education, professional development) that need to be integrated into certain proposals.
Each Phase 2 participant is expected to work with OARS to submit at least one proposal for external funding per year of participation and will submit a brief report to their dean and OARS annually.
Community meetings and other opportunities
Community meetings will be open to both Phase 1 and Phase 2 participants. All participants are expected to attend these meetings in their first two years of participation. Attendance is optional for those in their third through fifth years of participation. Meetings will be held approximately once a month during the academic year.
The overarching goal of these meetings is to build a community of support, so not all meetings will include formal programming. When formal programming is offered, topics will be selected by participants, and may include:
Talking to program officers
Developing proposal budgets
Developing broader impacts plans for NSF proposals
Tips/advice from funded researchers
Agency-, program-, or opportunity-specific information
Research-related intellectual property – publications and patents
Research ethics and integrity
Research computing support
Programming may be delivered by OARS staff, other Miami faculty or staff, the participating consultants, or other experts.
New Faculty GPS is not a writing workshop. However, faculty who would like additional peer support and accountability may choose to join other program participants in optional writing groups. Additional program-specific opportunities for networking and professional development may occasionally be offered, and participants are among the first to be notified about opportunities OARS makes available to Miami’s broader research community.
Results from the 2018-2019 cohort
In 2018-2019, we welcomed our first cohort of program participants. By the end of the academic year, 100% of them reported feeling more confident about future proposal submissions. A majority of participants also said they had applied or would apply to a “bigger” or more competitive program and that their proposals were of higher quality than they would have been without their participation in the program. Here are some of the things participants said they especially liked about the program:
“The accountability and support”
“[The] accountability it fosters”
“Having a mentor to guide you in proposal writing”
“Access to an external consultant and time with OARS staff members”
Application for 2019-2020 cohort
New Faculty GPS is open to tenure-track faculty (including librarians) in their first or second year of appointment. All eligible faculty were emailed directly with an invitation to apply to the program. Any eligible faculty member who did not receive an email invitation should contact me at johnsthb@MiamiOH.edu or 9-1760 if they are interested in applying. Applications are due by 5:00pm on Monday, September 30.
GPS icon image by mohamed_hassan. Mentor/mentee image by Tumisu. Both via Pixabay. Both used under Creative Commons license.