Dots used to represent data points.

Professional development opportunities for research data management available

If you are among the many researchers who are using the down time created by COVID-19-related curtailment of research for professional development, you might want to check out the data management resources below. The list was compiled by the Association of Public and Land-grant Universities (APLU) and the Association of American Universities (AAU) as part of an ongoing collaboration on public access to research. The APLU’s Council on Research, which distributed the list, offered special thanks to Utah State University; Lisa Johnston and Jim Wilgenbusch at the University of Minnesota; and Cynthia Vitale at Penn State University.

  • Data Management Short Course for Scientists – From Earth Science Information Partners (ESIP) in cooperation with NOAA and the Data Conservancy.
  • Data Management Training Clearinghouse – A registry for online learning resources focusing on research data management, hosted by ESIP.
  • DataONE Education Modules – DataONE provides several downloadable lessons in PowerPoint format that can be incorporated into teaching materials. Also available are webinars and screencast tutorials.
  • Research Data Management and Sharing – Coursera offers this five-week, introductory-level course [course started April 6]. Enrollment is free; an optional certificate of completion is available for a $49 fee.
  • Research Data Management: A Primer – Offered by the National Information Standards Organization (NISO), this primer covers the basics of research data management.
  • Data Management & Curation – The Inter-university Consortium for Political and Social Research (ICPSR), an international consortium of more than 750 academic institutions and research organizations, provides training in data access, curation, and methods of analysis for the social science research community.
  • Guide to Social Science Data Preparation and Archiving – Offered by ICPSR.
  • ETD+ Toolkit – Designed by the Educopia Institute for graduate students learning how to manage research for theses and dissertations, but useful to anyone involved in research.
  • MANTRA Research Data Management Training – A free online course from the University of Edinburgh for those who manage digital data as part of their research project. Modules include data protection, rights, and access; sharing and licensing; and metadata and curation.
  • Disciplinary RDM Training – Lists discipline-focused training units by RDMTrain. In addition to MANTRA (see above), units focusing on performing arts; archeology and social anthropology; health studies; and psychology are available. Maintained by the Digital Curation Centre of the U.K.

Image by Jisc, used under Creative Commons license.

Graphic of digital 1's and 0's on a high-tech looking background.

Data management plan resources are available

Screenshot of DMPTool.org. Links at top of page: Learn. Sign In. Title: DMPTool: Build your Data Management Plan. Ribbon: Welcome. Create data management plans that meet institutional and funder requirements. Get started [button]. Below image: DMPTool by the numbers: 28,787 users; 25,051 Plans [More link]; 229 Participating Institutions [More link]. Top 5 templates: NSF-SBE Social, Behavioral, Economic Sciences; DMP Template from DCC; Department of Energy (DOE): Office of Science; Digital Curation Centre; NIH-GEN: Generic. DMPTool News: New DMPTool launched today [link]. Go to blog. RSS feed icon [link]. Links: About; Terms of use & Privacy; Accessibility; GitHub; Contact us. Twitter and RSS feed icon links. Footer: DMPTool logo. DMPTool is a service of the University of California Curation Center of the California Digital Library. Copyright 2010-2018 The Regents of the University of California.

For some time, the NSF has required data management plans, and now the NIH has released a draft policy on making data sets used in NIH-funded research available to other researchers. (Read more about the new NIH policy from ScienceMag.org.)

Thankfully, resources for managing data are available to Miami faculty:

  • DMPTool.org allows you to create, review, and share data management plans that meet institutional and funder requirements.
  • Staff in the Center for Digital Scholarship are available for personalized reviews of data management plans prior to proposal submission.

To get started with DMPTool.org:

  • Navigate to DMPTool.org.
  • Click the big Get Started button in the middle of the screen.
  • Select Miami University (OH) from the drop-down list of institutions on the next page.
  • Click the green Next button.
  • Enter your Miami unique ID and password on the MUNet Login Page.
  • On the next page, click the green Create New DMP button and follow the prompts.

For questions about using DMPTool.org or to arrange a personalized review of your data management plan, contact Eric Johnson, Numeric and Spatial Data Librarian, Center for Digital Scholarship, King Library (513-529-4152).


Data image by DARPA (Defense Advanced Research Projects Agency) via Wikimedia Commons, public domain.

 

A gone-to-seed dandelion in a green lawn.

A proposal doesn’t have to be perfect to be funded

Three checkboxes on a piece of paper. Each checkbox has, respectively, a smiley, neutral, or sad face next to it.

We recently updated the proposal writing workshop that we’ve been running for years to include more active learning elements. One of the activities we introduced — small group work to analyze sections of sample proposals — resulted in some unintended consequences.

As anyone who has scoured the internet for publicly available sample proposals can attest, they’re really not that easy to find, especially if you’re interested in representing a range of disciplines and funding agencies. And hardly anyone posts the ones that did not result in funding, the ones that, to put it bluntly, failed.

So when we distributed sample proposals to workshop participants, they were all proposals that had resulted in awards. But, as participants soon discovered, that did not mean they were flawless. To be sure, there were good things going on in these proposals. But, in comparing these sample proposals to information we had provided about effective proposal writing, workshop attendees also found that the sample proposals fell short in a range of ways. Some failed to provide necessary context. Others lacked a clear needs statement or didn’t manage to articulate explicit connections between goals, objectives, activities, and evaluation. There were inconsistencies between narratives and budgets. There was tortured writing.

My colleagues and co-presenters, Amy Cooper and Anne Schauer, and I have reviewed countless proposals. We take it for granted that a proposal need not be, like Mary Poppins, practically perfect in every way to be funded. Perhaps to our shame, it didn’t occur to us to let the workshop participants know that all of the examples we were sharing were successful in securing funding. I have to admit to having been a little gobsmacked, then, to receive this request from a participant about halfway into the six-session workshop: “I wonder if it is possible to also [share] some successful proposals/good examples.” That’s what we’d been doing all along.

It turned out to be a teachable moment for both our team and the workshop participants. The lesson we learned as workshop facilitators was that we need to make our assumptions explicit, especially when providing guidance to beginners. The lesson we were able to share with the developing proposal writers is that there is no such thing as a perfect proposal.

The truth is that there are many factors in grantmaking beyond what appears on the page. For better or worse, politics, professional reputations, and social networks all matter. Sponsors’ hidden values and practices matter. The need to balance research/project portfolios matters. These factors can all help explain why a flawed proposal might be funded, and many of them are beyond the PI’s control.

Our argument — the very reason we even offer a proposal writing workshop in the first place — is that it’s in every PI’s best interest to control the things they can, and that includes the writing. I don’t remember the source, but someone once put it this way: No amount of good writing will save a bad idea, but bad writing can doom a good idea. In analyzing some less-than-ideal — yet funded — proposals, our workshop participants discovered that the system will tolerate some imperfection. That should come as a relief to anyone who will ever submit a proposal.


Written by Heather Beattey Johnston, Associate Director of Research Communications, Office of Research & Innovation, Miami University.

Checkbox photo via Pxfuel. Dandelion photo via Peakpx. Both used under Creative Commons license.

 

Hand tapping one star in a five-star review system.

Editor offers advice for handling a difficult peer review

We’re pleased to reblog this Duke University Press post by guest blogger Courtney Berger. Berger is an executive editor with a university press, so this post focuses on peer review of books under consideration for publication. However, most of her advice applies just as well to peer review of grant applications (just substitute “program officer” for “editor”!).


On a not-too-infrequent basis I see posts and memes in my social media feed denouncing the dastardly deeds of Reviewer #2—that querulous and impossible-to-please peer reviewer. I usually hover over the post, thinking that I might chime in with a bit of helpful advice. I am a book editor after all. Surely I can say something to help alleviate my friend’s experience of feeling misread, misunderstood, or even personally attacked by an anonymous peer reviewer/colleague. But I always resist weighing in, knowing that at that moment my friend just needs to voice their frustration and receive some affirmation. It can be painful to receive this kind of criticism, especially when facing the pressures of tenure and promotion. However, while momentarily painful, even a negative peer review can be a good thing, and you can use the report to strengthen your book. So, here’s a bit of practical and philosophical advice to help you work through a tough peer review.

1) Go ahead and vent—but be careful about where and how you do so.

As I mentioned, I see plenty of social media posts railing against Reviewer #2. No judgment. It’s good to get your community to support you through tough times. But I would caution against offering too much detail in a (semi)public forum or lingering in this phase for too long. It’s a small world—and although there should be an appropriate amount of distance between you and the reviewer, it’s always possible that they are in or adjacent to your social circles. You never know when the person you’ve declared to be the enemy of your book project will turn out to be the person you most wanted feedback from. (Yes, that happens!) After your initial venting, share the report with a trusted friend or colleague and get their feedback. Perhaps they will have a different take on the reader’s comments. They may identify productive advice that it was tough for you to see at first. If it helps, write a scathing response, voicing all of your frustration with the reader’s misapprehensions and misreadings. Get it all out. Then file it away.

2) Focus on problems, not solutions.

My colleague Ken Wissoker touched on this in his blog post on the merits of peer review, and it’s a strategy that I frequently employ to help authors shift their perspective on a review (even a positive one!). It’s easy to get hung up on the reader’s suggestions for how to improve your book. Maybe they recommend adding a chapter or including analysis of a topic or critic that you think is tangential to your project. Or, perhaps you feel like they didn’t “get” your argument or missed a point that’s already in the manuscript. Your job is to figure out why the reader is tripping up. If you said something and they missed it, that may not be the reviewer’s fault. Chances are the point is buried at the end of a chapter or not articulated with enough force. In that case, you need to clarify and highlight your claims so that the reader does get it. It’s not uncommon to have two readers—one more positive, the other more critical—pointing to the same issue. It’s just easier to hear the person who presents their comments more constructively. As the author, it’s your job to make the leap and to figure out what your readers need in order to be convinced. Once you do that, it will be much easier to come up with a revision plan.

3) Clarify your vision.

Use the reader’s comments to sharpen your own vision for the book. I often ask authors early in the process: what do you want your book to accomplish? Are you aiming to shift a scholarly conversation, revise an accepted history, offer a new theoretical tool? Do all of the parts of the book support that mission? Clarity on this point will help you to decide which advice to take on board and which to leave by the wayside. The goal of the review process is to help you write the book you want to write, but even better. Let me repeat that, since it’s easy to forget as you’re wading through frustration, self-doubt, or any of the other feelings that this process provokes. You should use the review process to help you realize your vision for the book and to help you say what you want to say in a way that will reach your readers. For a peer-reviewed book, you need to do that in a way that is convincing to other experts in your field; but the book is yours. (Note: I am setting aside exigencies such as tenure review, departmental pressures, and disciplinary policing, which can make this more complicated. But I always urge people to come back to their own ambitions for the project. The audiences and conversations you initiate or enter into with the book are the ones you’ll likely be engaging with for a while, and so they should be ones you care about.)

4) Talk to your editor.

Sometimes a negative review might mean that a press decides to turn down your project, and you may not have an opportunity to get substantial feedback from the editor. But other times, if the reports indicate that the project has great promise, an editor might be eager to work with you to see the book to publication. So process the report, get through the venting phase, and then set up a time to talk to your editor or send them an email with your preliminary thoughts and questions. As the editor, I have a different perspective. First, I know who the readers are, and while I keep their identities anonymous, I can also help an author think critically about the book’s audience and why a particular reviewer might be frustrated with the manuscript in its current state. For example, maybe you thought the book was for a history of science readership. Reviewer #2’s comments might help you to realize that this audience won’t be as receptive to your work. Is this who you are really writing for? If so, you may need to make some adjustments. If not, you may need to reframe the book for the readership you want. Also, I appreciate authors who can take a tough criticism and respond productively. I take it as a good sign when an author is willing to tackle Reviewer #2’s comments and use the feedback to make their book even better.

5) Remember that the review process is part of a larger scholarly conversation.

For many the review process simply feels like a set of hoops to jump through. And it can be that. But it’s also a chance to learn from your peers—just as you would when presenting a paper at a conference—and to respond. While there is the occasional mean-spirited reviewer, most readers are trying to be helpful. Try to receive the comments in the same spirit. Be grateful that someone took the time to read and think with you and take what you can from the conversation.

6) Make your response about you, not the reviewer.

Your editor may ask you to write a response to the reader reports, addressing the readers’ questions and laying out a revision plan. It’s tempting to use this as an opportunity to demonstrate all the ways that Reviewer #2 was wrong. (See #1 above: if you do this, keep it in your drafts folder.) Instead, focus on what you plan to do to improve the book. Now is the time for solutions! For example, if the reader didn’t think the book’s argument was cogent, offer a clear and concise overview of the book’s intervention. If the structure wasn’t working, explain how you will either adapt the structure or make the structure more visible so that the reader will understand it. And hold your ground when you need to. If you really don’t agree with a reviewer’s take on your project, say so and explain how you will make your vision for the project come to life.

7) Know when to cut your losses.

Sometimes a negative review is just a negative review. As difficult as it sounds, you may need to set it aside and move on—to a new press or to a new reviewer, depending on the situation. But hopefully with some of these strategies you can get the most out of the review process, and maybe someday you’ll even be thanking Reviewer #2 in your acknowledgments!


Source: What to Do About Reviewer #2: Advice for Handling a Difficult Peer Review

One-star review image by Gerd Altmann via Pixabay. Reviewer 2 image by KotaYamaguchi via imgflip.

 

Image improperly rendered on digital screen.

eRA Commons glitch prompts reminders about preparing proposals and checking submissions

Order forms and packing box, with text reminding viewer to compare numbers on forms and box to ensure they match. "Check here. And here. Be sure it's right. Check and double check!"

Earlier this fall, there was a glitch in NIH’s electronic submission system, eRA Commons, that caused blank pages to appear in place of content in some grant application submissions. NIH attributed the error to PDF attachments that were generated from scanned documents, rather than text files.

Although the eRA Commons issue has now been resolved and our proposal facilitators, Anne Schauer and Amy Cooper, did not notice this error happening with any Miami proposals, we thought this was a good opportunity to issue a few reminders about submitting proposals, whether using Miami’s Cayuse system or any other submission system.

Reminder 1

It’s always best to generate PDFs from text files created in Word or another word processing program. Using scanned images to create PDFs should be avoided whenever possible.
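If you want a quick way to spot image-only pages before uploading an attachment, a short script along these lines can help. This is only a minimal sketch, not an NIH or Cayuse tool: it assumes the third-party Python package pypdf is installed, and the file name is a placeholder.

```python
# Minimal sketch: flag PDF pages with no extractable text, a common sign
# that a page came from a scanned image rather than a word processor.
# Assumes the third-party "pypdf" package is installed (pip install pypdf).
from pypdf import PdfReader

reader = PdfReader("proposal_attachment.pdf")  # placeholder file name
for number, page in enumerate(reader.pages, start=1):
    text = (page.extract_text() or "").strip()
    if not text:
        print(f"Page {number}: no extractable text; it may be a scanned image.")
```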

Reminder 2

It is the PI’s responsibility to ensure that the submission is complete and accurate. Your proposal facilitator reviews your application prior to submission, but because they don’t have expertise in your field, they won’t always recognize when something — a technical figure, for instance — has not rendered properly. You should always check your proposal in the sponsor’s system (e.g., NIH’s eRA Commons) following submission to verify that everything appears the way it should.

Reminder 3

Many sponsors allow a period of time during which a PI may review and “fix” a submitted proposal. For example, NIH allows submissions to be reviewed, withdrawn, and resubmitted in eRA Commons for two days following submission. However, with many sponsors — including NIH — once the submission deadline has passed, no changes may be made to a proposal, even if the allotted review window has not yet passed. This is one of many good reasons not to wait until the last minute to submit a proposal. If you submit at the last minute, there may not be enough time for you to review the submitted proposal, let alone withdraw it, fix it, and resubmit it.


Digital screen glitch image by Rosa Menkman via Wikimedia Commons, used under Creative Commons license. Check and double check image from the National Archives at College Park, public domain.

A teacher listens while a young student reads aloud from a book.

Insights about applying to Spencer Foundation emerge from program officer visit

Roey Ahram, Spencer Foundation Associate Program Officer, visited Miami on October 3.

At the invitation of Dean Michael Dantley, Spencer Foundation’s Associate Program Officer, Roey Ahram, spent October 3 with Miami’s College of Education, Health, and Society. Ahram gave a detailed presentation in the morning and met with individual faculty members to discuss their research throughout the afternoon.

Ahram explained that Spencer is interested in education broadly defined, wherever learning occurs from birth through adulthood. They are responsive to education researchers’ needs and, as Ahram explained it, “fund whatever the field thinks we should fund.”

Ahram described the Foundation’s areas of interest, including creating and sustaining equitable education spaces, emphasizing the foundation’s view that “learning is a social justice process.” Another area of interest is innovative research approaches. Ahram explained that Spencer funds across the full range of educational research approaches and that the Foundation believes more research on the methods themselves is needed. Other areas of interest include learning and flourishing, as well as high-quality teaching and leaders.

The Spencer Foundation funds three types of grants:

  • Field-building activities (including conferences and mentoring)
  • Training fellowships for dissertations, post-doctoral work, and tenure-track faculty
  • Field-initiated research

A great deal of Ahram’s presentation focused on field-initiated research, which would likely be of most interest to Miami faculty. Please refer to the Spencer website for more details.

As Ahram explained, the Spencer Foundation is deeply interested in reflecting the diversity of educational researchers, as well as learners and educators, in the proposals they fund. He indicated that, historically, Spencer has supported primarily West Coast, Northeast, and large R1 Midwestern universities and colleges. They specifically want to diversify their geographic reach, which may make this an ideal time for Miami faculty to consider applying. If you’re interested in submitting a proposal to the Spencer Foundation, contact Amy Cooper to get started. Staff from University Advancement are available to help with relationship building and even writing and editing.

Ahram offered the following four tips for grant-seekers:

  • Start with a question.
  • Know your audience. Reviewers for Spencer research grant proposals include, at minimum, a subject expert, a research method expert, and a generalist.
  • Align the sections of your proposal. Ahram suggests specifically stating in the research methods section, “I’m answering my research question by . . .”
  • Learn from feedback. For full proposals, Spencer provides detailed reviewer comments. While Spencer grants are becoming ever more competitive — with some having as low as a 5% funding rate — resubmissions are faring better and better.

What makes a Spencer application stand out? Ahram says well-done literature review and methodology sections are critical, with a research question clearly driving the research. He emphasized that the methodology should include a well-articulated analysis plan, joking that “you’ll need more than T-tests!” Applicants should also justify their choice of analysis method with citations. See the “Resources and Tools for Applicants” section of the Spencer website for details.

Spencer program officers are also education researchers themselves and Ahram’s area of interest is special education. He stated that program officers are happy to discuss project ideas with applicants, but aren’t permitted to review proposal drafts. Finally, Spencer is looking for diversity in proposal reviewers; contact a program officer to volunteer.


Written by Amy Cooper, Assistant Director of Proposal Development, Research and Innovation; Brian Furnish, Assistant Vice President, Corporate and Foundation Relations; and Carrie Powell, Director of Development for the College of Education, Health, and Society, Miami University.

Photo of Roey Ahram by the Spencer Foundation. Photo of Urban Leadership Intern Madison Cook with students by Jeff Sabo, Miami University Photo Services.

Two children display a fish they caught, which is still on the line on their fishing pole.

From the archive: Enhance NSF MRI applications with these insights

A man crouches behind a little boy, showing him how to use the fishing pole he holds.
Like teaching the next generation to fish, training the next generation of instrument users and developers is critical to sustainability.

The Major Research Instrumentation (MRI) program offered by the National Science Foundation (NSF) “provides organizations with opportunities to acquire major instrumentation that supports the research and research training goals of the organization that may be used by other researchers regionally or nationally.”

MRI is a limited submission opportunity, meaning that the number of proposals submitted from a given institution is limited by the NSF. To determine which proposals Miami University will submit each year, OARS conducts a review of preliminary proposals. For the 2020 MRI competition, the window to submit to the NSF is January 1-21, 2020, but the deadline to submit preliminary proposals to OARS is October 28, 2019. With that date coming up, we thought we would re-run a post from 2017 that shares some insights about applying to the program.


INSIGHT 1: Get the basics right.

Be sure to read the solicitation carefully, even if you’ve applied in (multiple) previous years. Solicitations for longstanding programs do change from time to time, so it’s important to read each new solicitation. In fact, the institutional submission limits changed with the 2018 solicitation. Rather than submission limits being based on acquisition or development, they are now based on amount of funding requested. Institutions may submit up to two proposals with funding requests between $100,000 and $999,999 and one proposal with a funding request between $1 million and $4 million, inclusive.

At the NSF Spring Grants Conference held in Louisville in June 2017, Randy Phelps, the NSF staff associate who coordinates the MRI program, suggested the following points are especially important to note:

  • The program funds equipment for shared use, so the proposal must demonstrate use by at least two personnel. There can be up to four co-PIs on the project, but there can be more users than PIs.
  • The project period can be up to three years because the program will fund operation and maintenance of the instrument for that length of time.
  • Make sure that what you’re requesting is eligible for funding under the MRI program. In general, the program will not fund anything that can be re-purposed for non-scientific use after the end of the project period. Specific details about what can and cannot be requested can be found in the NSF MRI FAQs.
  • Remember that voluntary committed cost share is prohibited. While MRI requires that institutions share 30% of the total project costs, NSF does not allow institutions to volunteer to share costs over and above that mark. This prohibition extends to reduced indirect cost rates. (A quick sketch of the cost-share arithmetic follows this list.)
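Because the required 30% applies to the total project cost (the NSF request plus the institutional share), not to the amount requested from NSF, the arithmetic can trip people up. Here is a minimal sketch of the calculation; the dollar figure is purely hypothetical.

```python
# Minimal sketch of MRI cost-share arithmetic, assuming (per the bullet above)
# the required institutional share is 30% of total project cost.
def mri_cost_share(nsf_request: float, share_rate: float = 0.30):
    """Return (total project cost, required institutional share)."""
    total = nsf_request / (1 - share_rate)  # the NSF request covers the other 70%
    return total, total * share_rate

total, share = mri_cost_share(700_000)        # hypothetical NSF request
print(f"Total project cost:  ${total:,.0f}")  # $1,000,000
print(f"Institutional share: ${share:,.0f}")  # $300,000
```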

Mike Robinson and Paul James, members of Miami University’s Department of Biology, attribute much of their success in securing an award in the 2017 MRI competition to their recognition of Phelps’ first point.

“What was key for us was that we hit a broad swath of people and types of research,” Robinson says. “We included faculty working in developmental biology, physiology, ecology, physics, and engineering.”

Their proposal included Robinson as PI, four co-PIs (including James), and seven additional equipment users as senior personnel.

INSIGHT 2: Tell a story that resonates with reviewers.

“Get the instrument and they will come” is not a compelling story, Phelps said. Instead, he urged proposers to demonstrate that the science is driving the request for the instrument. There’s lots of advice out there (here, here, and here, for instance) for scientists who want to become more persuasive storytellers. In addition, Phelps offered this specific advice for MRI proposals:

  • Make sure that the format of your proposal emphasizes the science, rather than the instrument.
  • Consider grouping users into categories by type of use and organizing the proposal around these categories. Break down the use of the instrument by group, identifying the percentage of total use each group will account for. Demonstrate, for example, that Group A’s use will account for 60% of total use; Group B’s for 20%; Group C’s for 15%; and Group D’s for 5%. Then explain how each group’s use correlates to a corresponding percentage of the instrument costs. In this example, that means that since Group A will account for 60% of the instrument’s total use, the proposal should show that 60% of the instrument costs derive from the capabilities Group A users require.
  • Show that the instrument will be used — a lot. The less downtime you can project, the better your proposal will fare in review.

Robinson recalls that when he and James first decided to write the MRI proposal, conversations with colleagues were less than encouraging.

“I can’t tell you the number of people that told me there was no way we were going to get this award,” Robinson says. “We had all of these things going against us: We were going to have to work on the proposal over the holidays; neither Paul nor I had used the equipment; we were told we were going to have to have preliminary data on that very piece of equipment, which we certainly didn’t have; and they kept talking about broader impacts and how there was no way we could satisfy the NSF with that.”

But Robinson and James forged ahead, with the help of an external consultant provided by OARS.  Consistent with Phelps’ second recommendation, they organized their proposal around three types of use, or “themes.” Each of these themes incorporated the work of at least two of the proposal’s co-PIs or senior personnel, and Robinson and James worked hard to weave each researcher’s individual descriptions of their work into a coherent overall narrative. The end result was a story that clearly resonated with the program’s reviewers.

INSIGHT 3: Research training is a critical component of an MRI proposal.

Give someone a fish and they’ll eat for a day. Teach them to fish and they’ll eat for a lifetime. That old adage encapsulates NSF’s perspective on research instrumentation. Not only do they want to get instruments in labs to facilitate research today, but they also want to help create the next generation of instrument users and/or instrument developers.

“If a proposal does not describe research training — particularly for underrepresented groups — it will fail during review,” Phelps said.

The research training plan must be concrete, feasible, and able to be evaluated. Outreach — especially to K-12 students — is not fundable through MRI, and simply providing undergraduate training is not enough.

“All proposals will include [undergraduate training],” Phelps said. “What makes your institution stand out?”

Robinson and James’ proposal made clear that all of the undergraduate and graduate students working in the labs of the project’s PI, co-PIs, and key personnel will receive training to use the fluorescence-activated cell sorter (FACS) system that will be acquired with the NSF grant funds. Professional technicians working in the labs and in Miami’s Center for Bioinformatics and Functional Genomics (CBFG), where the FACS system will be housed, will also receive training. In addition, Robinson says his team “took the broader impact stuff very, very seriously.” So while there are no funds in the grant to support outreach activities, they will nevertheless incorporate FACS-related material into a range of activities that will be shared with K-12 students through STEM outreach initiatives of Miami’s Hefner Museum of Natural History.

INSIGHT 4: Treat the required Management Plan with as much care as you do the rest of the proposal.

Phelps pointed out that good scientists are not always good managers. So, he said, it’s important to reassure the reviewers that the project team is capable of competently managing the acquisition of the instrument, the operations of the instrument, the scheduling of user time, and the strategic use of downtime. For Robinson and James, these issues were resolved by involving the CBFG, whose staff has an extensive track record of managing instruments and coordinating user time.

INSIGHT 5: You probably need a Data Management Plan, even if you think you don’t.

It may not seem intuitive, but Phelps said he considers a Data Management Plan crucial for most MRI proposals. Acquisition is the perfect time to think about how to handle metadata and manage storage of the data generated by use of the instrument. If you can demonstrate a plan for facilitating the dissemination and sharing of the results of all the research that will eventually be conducted using the instrument, you give the reviewers one more reason to fund your proposal.


Written by Heather Beattey Johnston, Associate Director of Research Communications, Office for the Advancement of Research and Scholarship, Miami University.

Photos by Kemberly Groue, U.S. Air Force, public domain.