Sunday, June 19, 2016

2016 DOE-AMR Vehicle Technologies

Last week I had the opportunity to participate as a reviewer for the Department of Energy (DOE) Annual Merit Review (AMR) in Vehicle Technologies, specifically propulsion materials. This past Friday, I finally submitted the two reviews assigned to me. 


Between procrastinating and not being 100% efficient, the two reviews took me the entire afternoon. In my sample size of two, the second one was "easier" than the first. I was talking to my lab director, and I mentioned that I did not think two reviews would take me so long.

The overall experience was interesting. The presentations I attended were allotted 30 minutes each: about 20 minutes for the presentation and 10 minutes for questions and answers, with priority given to the reviewers. Unsurprisingly, some presenters still went well over the 20-minute mark, like in any other conference. But this was not like any other conference. For one, there was no conference fee. Secondly, the contents of the presentations were organized in a way that made the review process challenging for me as a first-timer.

Reviews needed to cover each project's approach, technical content, collaborations, and future work. Collaborations were often quickly glossed over and rated on a scale of 1-4, seeming more like a check-box than anything else. The approach section generalized the project's objective so that anybody could understand it, and since most of these talks dealt with Integrated Computational Materials Engineering (ICME), it typically included a flow diagram as well. The technical content focused on what was achieved that year, and finally the future work covered what would be done the next year, which at times seemed relevant to the technical content presented and at other times seemed out of place until you realized you needed to look at the big picture.

Therein lay the challenge for me as a first-time reviewer. Both projects I was reviewing were roughly halfway complete, so at times I couldn't see the complete picture of what had been accomplished before. Even the technical content was a quick overview of the major achievements, without always digging deeper into the science, which made the merit of the future work even harder to judge. Lastly, I approach science with a mostly positive mindset (until I learn more and become a pessimist), so I think all approaches are generally valid and very cool. Then to score and comment on each of these categories on an arbitrary scale from 1-4...

Reviewing a project based on a 20-minute presentation was unlike reviewing a paper, where you do have all the details. And as scientists, details are important to us. I started making the effort to look up what papers had been published, but most of this could not be found since it was all recent work. It took me a while before I finally conceded, took a step back, and judged the project based on what was presented to me (as well as last year's presentation). I realized I had to judge the project, and not necessarily the science (although it is a large part of it). I gave my input on the things I was impressed with, but also on what I thought could use some more work or consideration. I hope that if I do this again in the future, I'll also be able to give input on what I found was lacking (which comes with years of knowledge in the field...).

It was an experience to sit on the other side and be presented to, rather than making the presentations for my advisor as I had once done during graduate school. 
