The Open IGERT: Review of the Reviews – Grant Declined

Sorry for the weird title; I wanted to convey as much about this post as possible without making the title huge. Long story short: the IGERT grant that my peers and I submitted in August has been reviewed, and the reviews are in.

The Open IGERT has been declined.

I’m not at all surprised. I knew we were a long shot. And I knew we weren’t focusing on issues that, while not explicitly required or mentioned by the NSF, were things they would want us to focus on. But I thought we put together a powerful program. Unfortunately, the NSF and their army of anonymous peer reviewers thought otherwise.

So in an effort to improve the program, I am going to share the reviews with you all and comment on them. The reviews and the review summary can be found in my Google Drive folder. Feel free to poke around. For reference, here is the original NSF call for proposals. And so you can skip the link, here are the main objectives of the IGERT call:

  • …NSF recognizes the need to educate and support a next generation of researchers able to address fundamental challenges in
    1. core techniques and technologies for advancing big data science and engineering;
    2. analyzing and dealing with challenging computational and data enabled science and engineering (CDS&E) problems, and
    3. researching, providing, and using the cyberinfrastructure that makes cutting-edge CDS&E research possible in any and all disciplines.

On to the reviews:

Review 1

  • What is the intellectual merit of the proposed activity?
    • “The basic notion that scientists and engineers increasingly need to be aware of the data life cycle and the principles and practice of data stewardship is hard to dispute but it does not in and of itself seem to provide a sufficient vision for an IGERT program.”
      • I’m very happy that this reviewer agrees that the focus of the IGERT – training students in the importance of data management – is important. I’m sad that they do not see data management education as a sufficient basis for an IGERT program.
    • “The proposal does not seem to have any disciplinary focus that would ground these issues in pressing problems and make the argument that a specific area or set of areas is in particular need of training in data management.”
      • The argument of our proposal was that all disciplines would benefit from understanding the importance of data management. Thus the proposal is interdisciplinary by definition, in agreement with NSF IGERT values. That this reviewer wants a focus on a specific discipline is contradictory and confusing with regard to the objectives of the IGERT program.
      • Also, these issues were grounded in open science. The need for data management tools in an open environment is imperative, and training students to develop and understand those tools would be the focus of the program. Dismissing that focus just because it isn’t rooted in physics, computer science, biology, etc. seems close-minded for an institution that is looking for innovative educational programs.
      • Perhaps it is our fault (the writers’) that these points were missed, because we chose to root the issues in a discipline like “computational and data enabled science and engineering,” as the IGERT call instructs.
    • “The proposal states that it will target students from STEM departments, but that covers a wide range of areas and a wide range of cultures with disparate needs and issues… A civil engineer’s interests in data management would be very different from someone who was working in high energy particle physics. Moreover their backgrounds would be radically different which would probably necessitate very different educational programs.”
      • The first sentence seems to properly illustrate the goal of all IGERT programs – bringing students from various backgrounds together. But for some reason it is written like that is a problem with our proposal.
      • They also do a very good job pointing out that disciplines differ and that educational programs differ. Sorry for the sarcasm. But in all seriousness, a civil engineer and a particle physicist do have different data management needs, and that is exactly what we wanted to highlight as the focus of our program. Many of the extracurricular aspects of the program were about working with the students to develop an institutional repository that would meet all of their needs, and the needs of the entire university. Perhaps that was not clearly written, which would be a valid complaint.
    • “While one can make the argument that all scientists and engineers are, in one way or another, responsible for collecting and disseminating data. This general statement masks serious differences in what exactly that means to different communities.”
      • I actually agree with this. But how else are we supposed to describe the issues and fit the rest of the program within the proposal submission guidelines? This is something to focus on and work on in the next iteration.
  • What are the broader impacts of the proposed activity?
    • “While the courses seem perfectly reasonable in and of themselves the lack of a disciplinary focus seems to diminish their appeal. From the perspective of a putative student in a discipline like Biology it seems perverse to require 3 courses on generic topics in data management when that time could be spent  studying the genomic databases or medical databases that may be directly relevant to their studies. One could argue that, from the perspective of the student, it would be more profitable to proceed from the study of specific data management and analysis issues to more generic ones rather than the reverse.”
      • I totally agree with most of this critique, but disagree on one important point. All IGERT programs require students from various disciplines to learn techniques in some broad, loosely useful program. I have been involved in two IGERT programs where the classes were a huge waste of my time. So I agree that subjecting students to 3 courses on generic topics is perverse. But these classes aren’t intended to be generic. As teachers, it is our responsibility to tailor the courses to meet the needs of the students, and I feel that failure to recognize that is a misstep in reviewing, though maybe I’m off base with that.
      • Again it is our responsibility to make sure the reviewers understand what we are intending, and in this regard we failed.
    • “Here again there seems to be an opportunity to craft a program of study that would appeal to a particular set of students and give them the tools and perspectives they would need to excel in the era of data driven science.”
      • This comment makes sense, and it is from this statement that I understand it probably would have improved our chances of securing the grant if we had focused on something specific, like data management in the biosciences, incorporating the various disciplines that could use it (biology, biomedical engineering, computer science, biophysics, etc.). Damn me and my lofty goals!
  • Summary Statement
    • “This proposal identifies the important issue of data stewardship and argues that graduate students should be trained to use modern tools and techniques to collect, analyze and disseminate the data that they collect. This is indubitably true, however the proposal would benefit from a more focused vision which would target specific problems or communities who could benefit from adopting better data management practices. Such a vision could inform the choice of a specific set of topics that the educational program would address and motivate a research program which would seek to ameliorate data challenges in that area.”
      • This is actually a really good summary that could help develop a better program.

It saddens me that tackling a specific problem is what the NSF is looking for. Big data and data management are new issues in the scientific world, and I’ve never heard of a situation where education in a broad sense is not valuable. Currently there is NO education on any of these issues, and instead of picking a single field in which to increase education, we are trying to educate every field as a whole. As the program develops we would hope to tailor that education to meet specific needs, but we can’t know what those needs are until we (the world) know what problems exist.

The Library of Congress didn’t try to understand how online scientific information is obtained, disseminated, used, etc. in one specific field. They tried to get a broad overview of the situation so they could assess it and tackle it.

Review 2

  • What is the intellectual merit of the proposed activity?
    • “This proposal aims to transform the research culture from a closed research model to one where products of research are public goods. If successful, this training model may have broad impact at other institutions. The research topics includes a data life cycle industry approach to research projects with a focus on educating student on how to manage trend of open access science. The research efforts are tightly aligned with the proposer’s themes of data, collaboration and openness; however more specific research topics for IGERT fellows are not provided to assess the relatedness/innovation to the open and emerging challenges within the fields.”
      • This seems to be aligned with the previous reviewer’s comments. It’s alarming to me that most modern scientists feel that science is so specific that broad, simple questions don’t really have a place. Why the need to always answer a very specific question, especially in an emerging field?
      • Regardless, the main commentary is noted. Next time focus on a more specific topic and work that area.
    • “While communicating the current research efforts and open science endeavors is an important aspect, the primary communication medium, a blog, may not be the appropriate choice for the intended audience.”
      • Why was ONE of our methods of outreach the focus of this reviewer’s review?
      • I want to make the broad declaration that this reviewer doesn’t see the value in modern communication methods, and probably doesn’t value modern outreach methods and models. I hope this isn’t true and would love to talk to Reviewer 2 about it. I have an army of online scientists who would beg to differ.
    • “The course description, objectives and expected student outcomes are helpful in understanding the course content to be offered, but the intended placement of these courses in their degree program is not clear. These courses appear to be established courses in the curriculum.”
      • This is a very valid criticism. In our discussions while developing the curriculum, we had talked about providing a minor in Library Science (or something to that effect), and even incorporating these classes into the OLIT program. Looking back at the grant, those discussions didn’t make it into the proposal. Our bad…
  • What are the broader impacts of the proposed activity?
    • “The program evaluation and assessment makes use of an established external evaluation team but the reviewer has a concern that the program does not have research and educational objectives and expected outcomes.”
      • The external evaluation aspect of our proposal is the weakest part of the proposal. So these comments are well received and I actually agree with them.
      • We actually read some other IGERT applications (both successful and unsuccessful proposals) and modeled ours after those. In none of those cases was there a real clear outline of the evaluation process. We thought we could get away with that, and we were wrong.
    • “The proposers have outlined an extensive list of UNM student organizations that will be contacted in their outreach activities without a plan of making these activities an integral part of the IGERT program. Given that a Co-PI is well-established with the STARS Alliance, the STARS Alliance may be better included in the broadening participation activities. The mentoring and retention initiatives could be enhanced to address specific issues, goals and objectives related to student needs.”
      • Perhaps it wasn’t clear how the UNM student organizations would be incorporated into our program. That is very valid. I didn’t feel it was worthwhile explaining how students would use existing infrastructure for support, since I would explain that to the students themselves in my mentoring role.
      • Again, specific needs being addressed could have helped the proposal.
      • I have no idea what the STARS Alliance is. This is probably a lack of communication on our part, which is interesting because I’m pretty sure I read the final full proposal. Maybe something was included last minute that I missed?
    • “The proposed program includes the option for study abroad program in partnership with UTPL while the figshare opportunity for travel to England could be better clarified in terms of research directions and educational training options.”
      • Again, valid points. This is also my fault, as I assumed (wrongly) that specific direction and training would be assigned at the time of placing the students, and that it would also be decided then how the students would best benefit from those opportunities. Given that every internship opportunity that I’ve ever been a part of, or witnessed, has been decided almost spur of the moment, I don’t think this should really have been held against us. Am I incorrect in that viewpoint?
      • Neither abroad aspect was really defined specifically, so why they chose to focus on figshare over UTPL is interesting.
    • “The recruitment and retention information provides percentages while raw numbers may be more beneficial in understanding the possible impact.”
      • OK.
  • Summary Statement
    • Section missing – Incomplete review?

I suppose the lack of a summary statement is not significant, but it is to me. Although it could be argued that this 2-paragraph review of a 25-page grant is summary enough. I do have to say (sarcastically) that that is some great constructive criticism! Sarcasm aside, there are some constructive comments that are very useful. And I suppose they did give us a rating of “good,” so maybe I should shut up. Still, it would have been nice to know what exactly was good about our grant.

Review 3

Sweet, some real insight!

  • What is the intellectual merit of the proposed activity?
    • I’m not going to quote, but the entire first paragraph feels like a lecture and a difference of opinion rather than a critique of the proposed program. “But having new skills will not change a culture.” Exactly how will a culture change without education? That statement to me is the thing that sticks out the most in that paragraph.
    • “The team of the five investigators and three senior personnel is mostly composed of library-related experts, none of them of outstanding stature as a high-profile researcher. H-values of up to 9 show a modest success with traditional peer-reviewed publications. This lack of scholarship is also evident in the list of references, which comprise a total of five citations.”
      • It’s good to know that one of the reviewers of an innovative program is really ingrained in useless and outdated career-measuring systems. See, I can share personal opinions in a review as well.
      • It’s also good to know that a reviewer in a cyberinfrastructure proposal doesn’t value modern library scientists.
      • Exactly what is your h-value? Show me your credentials, since you’ve seen mine. Sorry, I don’t like anonymity.
    • “The core courses, already in place, all relate to issues of data stewardship in an open environment, but they have a very strong library perspective. In theory students could already obtain the desired exposure through existing avenues… The education plan does not explicitly relate to the IGERT fellows’ doctoral research, as the data stewardship education is seen as complementary to, not integral of, each student’s dissertation research. The fact that students will pick only one from among the 17 electives will not expose them in course work to many other disciplines than their home.”
      • We chose the library perspective intentionally. Is library science not viewed as a valid scientific discipline? Especially since the librarians included in the proposal all have “hard science” backgrounds.
      • I understand that data stewardship can be viewed as a complementary skill. But it is also a skill that is central to a dissertation. How can you handle data if you don’t have the educational background? Maybe we forgot to mention that the students we recruit would incorporate these skills into their dissertation research.
      • While the view that one elective is not enough exposure to other disciplines is valid, we did include other activities in the program that provided appropriate exposure to other disciplines. Plus, the core courses were designed around interdisciplinary projects so students from different disciplines would work together.
    • “The internships with industry leaders in open access tools (documented with letters of support from PLOS and BenchFly) are a fine complement, but since there are few companies with such a focus, it is unclear whether all IGERT fellows will obtain places.”
      • Don’t forget figshare…
      • I suppose the concern is important. But I’m pretty sure the budget clarifies this. Either way, I would need to look into improving this section, because to me this was the best part of the program.
    • “The most innovative aspect of the education plan is the Open Research Challenge, in which IGERT fellows will lead other students in the development of a novel open-research tool. The specifics, beyond a proposal process and the linkage of the proposal to the first core course, are sparse. In particular no information is provided how the allocated $50,000 will be managed and who will be in charge of leading beyond the proposal process.”
      • This is also a valid criticism. The hope was that the idea would be enough to show the intent and then we would develop the plan as the program began. We also figured that we could let the students dictate how to moderate the challenge with their ideas, instead of set limits that would stifle their creativity.
  • What are the broader impacts of the proposed activity?
    • In this section there is a lot of good criticism of the IGERT, both positive and negative feedback that can be read constructively. I especially appreciate their pointing out the flaws of the program, with the evaluation aspect being the main one.
    • “The documented track-record of completed PhD supervision by the IGERT team is very mixed, as only two participants list numbers that add up to 5 PhDs in the last five years.”
      • This may be a very valid negative of the program. Most of the PIs listed were young faculty. But it was my belief that we would recruit students from strong advisers and guide them through our program. The intention was never to populate our program with our own students. Regardless, I can see how reviewers wouldn’t have faith in a mentoring program if the mentors have never graduated students.
    • “The theme of the proposal aims explicitly at the development of professional skills.”
      • Is this a problem with the proposal? While we never specifically say that the program is professionally oriented, I can see how the industry focus of the internships could drive that claim. But the intent there is a well-rounded education. And we already have enough PhDs trying to get back into academia. Let’s spread the wealth a little.
  • Summary Statement
    • “Overall, the proposal rates low on intellectual merit, because the theme chosen is at odds with the quest for coupling emerging research with interdisciplinary education, and the IGERT team lacks stature. The proposal rates, however, high on broader impacts, particularly diversity. In summary, this is a fair proposal.”

      • I knew we would rank strongly for diversity and our broader impacts. I expected more from the educational component.

Overall a pretty decent review. At least they took the time to write constructive responses. Although I’m a little saddened that traditional thinkers are put in charge of reviewing innovative programs. Most of the criticism is completely valid, but the focus on the high-impact stature of the team is a little much. I REALLY want to prove this person wrong now. #fueltothefire (sorry, couldn’t help it.)

Review 4

This person provided a 3-sentence review. A couple-sentence response shouldn’t be accepted in peer review of publications, and when that is the response to a multi-million-dollar program it should be rejected. I have zero respect for this person. Thank you kindly, cowardly anonymous reviewer. Your critique has not been noted and I will not even rebut your remarks. Thanks for spending 15 seconds offering feedback on a program I spent months working on and you spent 5 minutes reading.

Review Summary

I will avoid posting line-by-line critique of the review summary. But I will respond to a few things:

  • Overall there was concern that there was no mention of innovative education. I retort that there was a lot. The internship opportunities were top-notch. The classes we structured were going to be collaborative learning environments, completely different from standard lecture classes. The development of a collaborative workspace was to provide students a place to interact and learn from each other. And the mentoring would be on another level. As a former IGERT student, I haven’t EVER witnessed innovative education in those other programs that was different from my normal curriculum. I even wrote a section in the grant that explained how this IGERT would build upon and improve other IGERTs held by UNM.
  • It was felt that there was a lack of interdisciplinary connection. In every other program, the IGERT just puts different students in the same classroom. Our program would put them in the same learning environment and essentially force them to use their different skills to work together. If that isn’t the nature of interdisciplinary and collaborative research then I have no idea what those terms mean.
  • They say that mere IT training is subpar for cyberinfrastructure research. Sure, but we aren’t training students how to use a computer. We are training them to develop cyberinfrastructure in their home labs, to understand how that infrastructure can be developed to further research, and to see how new fields of research are emerging from the development of cyberinfrastructure. Somehow that point didn’t come across.

It must be noted that this is my first opportunity for peer review. And I have to say, it did not disappoint…

…in the regard that the reviews took a while, there was an entire review that was THREE sentences long, the peer-reviewers were pretty close minded about certain aspects, etc etc.

There was a lot of valid criticism, most of which would be helpful if it were elaborated on. It does give me the motivation to rewrite the grant and submit it to other organizations that may be more open-minded about the process. I should also contact the leaders of the Open Data IGERT to see their grant and ask how their program is run, etc., etc.

The positive of this experience is that I’m not broken down, and I want to pursue this educational program even without NSF funding. I think it could be really beneficial for the future of science, and it is more than likely a necessary step for the future of networked science.