Whether you are running the benchmark yourself or the computer vendor is running it, it is a good idea to set up a benchmark report template as part of the planning step. With a format to follow, consistent information is reported, which eases comparison. Make an annotated spreadsheet to fill in. What should be included in that template?
There should be an itemization of the hardware and software configurations, including the versions actually used. Yes, you specified this earlier (see "Software"), but what if the computer vendor used a different version of Oracle or LS-DYNA? There are often feature and performance differences between versions, and sometimes results from different versions simply cannot be compared.
What metrics did you decide to use for each component of the benchmark? Is it CPU seconds, elapsed time reported in hours, minutes, and seconds, or the number of runs completed per hour, to name just a few? Identify each component in the template and spell out its metric clearly. Avoid metrics that are specific to one or a few vendors; stick to commonly available metrics.
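As a sketch of what such a template might look like, the following writes a skeleton spreadsheet as a CSV file. The section names, configuration items, components, and metrics shown are hypothetical placeholders; substitute the ones you actually chose.

```python
import csv

# Hypothetical template rows: a configuration section and a results
# section, one row per item, with an empty Value column to fill in.
rows = [
    ["Section", "Item", "Value"],
    ["Configuration", "Hardware model and CPU count", ""],
    ["Configuration", "Operating system version", ""],
    ["Configuration", "Application version (e.g., Oracle, LS-DYNA)", ""],
    ["Configuration", "Compiler version and flags", ""],
    ["Results", "Component A - elapsed time (hh:mm:ss)", ""],
    ["Results", "Component B - CPU seconds", ""],
    ["Results", "Throughput - runs completed per hour", ""],
]

with open("benchmark_report_template.csv", "w", newline="") as f:
    csv.writer(f).writerows(rows)
```

Annotating each row in this way forces every vendor to report the same items with the same metrics, which is the whole point of the template.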
Double-check that the template matches the rest of the document. There is nothing more confusing than one part of the rules stating one thing and another part contradicting it. Which is right? It leads to questions from every computer vendor.
The benchmark expert may have made changes to your scripts, programs, or setups as part of porting or optimization. Request that any modifications be included as part of the report. This could be the output of the UNIX diff command or simply a commentary. You decide what works best for you.
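If you prefer machine-readable change reports over commentary, the standard library's difflib module produces the same unified format as diff -u. This is a sketch with hypothetical script contents; in practice you would read the original and modified files from disk.

```python
import difflib

# Hypothetical job script before and after the vendor's tuning.
original = ["#!/bin/sh\n", "mpirun -np 8 ./solver input.dat\n"]
modified = ["#!/bin/sh\n", "mpirun -np 16 ./solver input.dat\n"]

# unified_diff yields diff-style lines: "-" for removals, "+" for additions.
report = "".join(difflib.unified_diff(
    original, modified, fromfile="run.sh.orig", tofile="run.sh"))
print(report)
```

A diff like this makes it immediately clear, for example, that the vendor doubled the process count, something a prose summary might gloss over.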
Are there any files that you would like back, or to take with you, perhaps to recheck the results later? Identify them and the media that you will accept. CDs and 4mm DDS4 tapes are the current technology. Perhaps an upload to an anonymous FTP server will suffice. Give that information. Avoid requesting that massive amounts of data be returned unless it is necessary for your verification of correct results.
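One way to keep the returned data small while still supporting later rechecking is to record a checksum of each result file at benchmark time and compare it against the copy you receive. A minimal sketch, assuming the file fits the streaming pattern below (the function name and defaults are illustrative, not from the original):

```python
import hashlib

def file_checksum(path, algorithm="md5", chunk_size=65536):
    """Return the hex digest of a file, read in fixed-size chunks
    so that large result files never have to fit in memory."""
    h = hashlib.new(algorithm)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()
```

Matching digests on both ends confirm the returned file is the one produced during the run, without shipping every intermediate file back on tape.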