
Assessor Handbook

A guide for both General and Detailed Assessors on the selection process and assessing Proposals under the Discovery Program and Linkage Program schemes for

LP18, SR18, DE19, DP19, IN19 and LE19.

Version 4. Release date 7 March 2018

Contents

1. Overview

2. The Assessment Process

2.1 General Assessors

2.2 Detailed Assessors

2.3 Rating and Ranking Assessments

2.4 Important Factors To Consider When Assessing

3. General Assessors: Selection Advisory Committee (SAC) Meeting Preparation

3.1 Roles and Responsibilities before the SAC Meeting

3.2 Roles and Responsibilities at the SAC Meeting

3.3 Attendance Arrangements

4. Ensuring Integrity of Process

4.1 Confidentiality and conflict of interest (COI)

4.2 Research integrity and research misconduct

4.3 Proposals outside an assessor’s area of expertise

4.4 Eligibility

4.5 Unconscious bias

5. Questions during the Assessment Process

Appendix 1: Discovery Program Rating Scale and Selection Criteria Considerations

Discovery Early Career Researcher Award (DE19)

Discovery Indigenous (IN19)

Discovery Projects (DP19)

Appendix 2: Linkage Program Rating Scale and Selection Criteria Considerations

Linkage Projects (LP18)

Linkage Infrastructure, Equipment and Facilities (LE19)

Special Research Initiative: PFAS Remediation Research Program (SR18)

Glossary

Frequently Asked Questions

Updated sections in Assessor Handbook

Assessor Handbook: A guide for both General and Detailed Assessors – Version 4.0

1. Overview

This Handbook provides instructions and advice for both Detailed and General Assessors on the Assessment process for most schemes under the Australian Research Council’s (ARC) National Competitive Grants Program (NCGP). The NCGP supports the highest-quality fundamental and applied research and research training under two funding programs, Discovery and Linkage.

The specific Objectives and Selection Criteria for each of the schemes covered in the Handbook are listed in Appendices 1 and 2, and are also available in the Funding Rules on the ARC website.

This handbook covers assessment for Linkage Projects 2018 (LP18), Special Research Initiative: PFAS (per- and poly-fluoroalkyl substances) Remediation Research Program (SR18), Discovery Early Career Researcher Award 2019 (DE19), Discovery Projects 2019 (DP19), Discovery Indigenous 2019 (IN19), and Linkage Infrastructure, Equipment and Facilities 2019 (LE19). For Linkage Projects 2017 (LP17), please refer to the Assessor Handbook Version 2 on the ARC website.

This handbook does not cover the assessment of Proposals under the Australian Laureate Fellowships, Future Fellowships, Industrial Transformation Training Centres, Industrial Transformation Research Hubs, ARC Centres of Excellence, Learned Academies Special Projects, Supporting Responses to Commonwealth Science Council Priorities or Special Research Initiatives (other than SR18) schemes.

2. The Assessment Process

Peer review plays a critical role in the assessment of ARC Proposals and is undertaken by two groups of experts known as General and Detailed Assessors. Experts from each group assess Proposals against the relevant scheme selection criteria and contribute to the process of scoring and ranking research Proposals. The objective of the assessment process is to ensure that the highest quality research Proposals are recommended to the ARC CEO for funding. The CEO then makes recommendations to the relevant Minister, who decides which Projects will be allocated funding under the NCGP.

The Research Management System (RMS) is the web-based computer system available for the preparation and submission of research Proposals, assessments and rejoinders for the ARC. The RMS Handbook for Assessors, a guide for General and Detailed Assessors to navigate the RMS assignment and assessment process is available on the ARC website.

General and Detailed Assessors have different roles in the peer review process. Key aspects of their roles are outlined in Sections 2.1 and 2.2 respectively.

2.1 General Assessors

The Selection Advisory Committee

For each funding scheme round, General Assessors are selected to form a Selection Advisory Committee (SAC) to oversee the peer review process. The General Assessors are chosen to ensure relevant expertise based on the requirements of the scheme. All SACs will include members from the ARC College of Experts (CoE) and may include eminent members of the wider academic community and/or key industry groups. SACs may also be divided into panels of different disciplines.

Following the deadline for submission of Proposals, Executive Directors at the ARC assign each Proposal to General Assessors. The lead General Assessor (Carriage 1) is usually closely associated with the Proposal’s academic field and other General Assessor(s) (Other Carriage or Carriage 2, 3 or 4) have supplementary expertise. Carriage 1 has primary responsibility for the Proposal, which will include speaking to the Proposal and its assessments and rejoinder at the SAC meeting.

Detailed Assessors are assigned by either the Carriage 1 or an Executive Director at the ARC. There are different requirements for the number of Detailed Assessors to be assigned for each scheme. Table 1 below provides an overview of the SAC format and assignment requirements for each of the schemes.

The Carriage 1 is asked to select assessors from different institutions and a variety of organisations, nationally and internationally, to achieve a balanced evaluation of the Proposal. Particular attention is needed to ensure that multiple assessors from the same department are not assigned and that the gender balance of Detailed Assessors is taken into account. If the assigned Detailed Assessors and reserves become unavailable, an ARC Executive Director will assign additional Detailed Assessors.


Please note: after assigning the required number of assessors in RMS and following the ARC’s announcement of assignments, the Carriage 1 may notice that some Proposals appear to need more assignments. This occurs when previously assigned assessors decline the assignment; however, no further action is required from the Carriage 1, as the monitoring of assignments, acceptances, rejections and submissions is managed by ARC staff. For Linkage Projects specifically, as Proposals are submitted continuously throughout the year, the Carriage 1 only needs to assign Detailed Assessors to the latest Proposals, as indicated by the date in the RMS automated email. Previously assigned Proposals will remain in the Carriage 1 assignment list and may appear to need further assignments, but the Carriage 1 should not assign any additional Detailed Assessors to these Proposals, as ARC staff will manage this process.

Table 1: Overview of SAC format and assignment requirements for each scheme.

Discovery Early Career Researcher Award (DE19) and Discovery Projects (DP19)

SAC: SACs are comprised of CoE members and are made up of disciplinary panels:

- Biological Sciences and Biotechnology (BSB)

- Engineering, Information and Computing Science (EIC)

- Humanities and Creative Arts (HCA)

- Mathematics, Physics, Chemistry and Earth Sciences (MPCE)

- Social, Behavioural and Economic Sciences (SBE)

Interdisciplinary Proposals sit with Carriage 1’s panel and have one or more Carriages on other relevant panels.

Assignments: ED assigns 2 General Assessors. Carriage 1 assigns 4 Detailed Assessors and 4 Reserves.

Discovery Indigenous (IN19)

SAC: One interdisciplinary SAC that includes CoE members and representatives selected from experts nominated by AIATSIS, Batchelor College and NATSIHEC.

Assignments: ED assigns 3 General Assessors. Carriage 1 assigns 4 Detailed Assessors and 4 Reserves.

Linkage Infrastructure, Equipment and Facilities (LE19)

SAC: One interdisciplinary SAC.

Assignments: ED assigns 2 General Assessors and 4 Detailed Assessors and 4 Reserves. For Proposals requesting $1 million or more from the ARC, EDs will assign 4 General Assessors and 4 Detailed Assessors and 4 Reserves.

Linkage Projects (LP18)

SAC: The LP scheme is the only scheme to identify high- and low-scoring Proposals to be fast-tracked; these Proposals are not discussed at the selection meeting. SACs may be interdisciplinary or disciplinary, depending on the number of Proposals being assessed at the selection meeting/s.

Assignments: ED assigns 3 General Assessors. Carriage 1 assigns 8 Detailed Assessors.

Special Research Initiative: PFAS Remediation Research Program (SR18)

SAC: One interdisciplinary SAC that includes CoE members and representatives from CSIRO and industry.

Assignments: ED assigns 2 General Assessors. Carriage 1 assigns 4 Detailed Assessors and 4 Reserves.

General Assessment Process

All Assessors must declare any Conflicts of Interest (COI) and reject the assignment as soon as possible if a COI exists. This will assist in the timely re-assignment of Proposals. See Section 4.1 for further information on COIs.

Preliminary Assessment

Following the assignment process, while the Detailed Assessors are undertaking Assessments, General Assessors independently read and assess all their assigned Proposals against the relevant criteria, based on an A to E Rating Scale. These assessment scores can be saved in RMS (but not submitted), or recorded by the General Assessor in their own notes.

General Assessment ratings are not submitted until after Rejoinders have closed and General Assessors have reviewed all available Assessment information.


In the Rejoinder process, the comments from Detailed Assessors are provided anonymously to the Applicant. The Applicant then has an opportunity to provide a Rejoinder to address any issues raised by the Detailed Assessors.

Final Assessment

After the Rejoinder process has closed, General Assessors review the Detailed Assessors’ comments and scores and the Applicants’ Rejoinder text. Detailed Assessments and Rejoinders are intended to inform General Assessors’ scores, and at this point General Assessors can review and, if necessary, revise their preliminary ratings.

For Proposals that have a disagreement in scores between the General Assessors, Carriage 1 is responsible for contacting the other Carriage(s) and discussing their ratings. General Assessors are not required to agree on or align their scores for a Proposal, but if the scores are disparate they need to have an understanding of why their opinions differ. Following this discussion, final ratings should be submitted in RMS.

When all final ratings are submitted, RMS produces a ranked list of all Proposals ― see Section 2.3 for more details. This list is used at the SAC meeting to assist with the identification of Proposals that are of sufficient quality to be fundable. The ranking of Proposals on this list is not final and the meeting process provides several opportunities for the SAC to discuss and review all Proposals, as outlined below.

Inappropriate Assessments

If General Assessors are concerned about the appropriateness of any Assessment text from Detailed Assessors, or identify a COI, then they must contact the ARC by sending an email to the relevant scheme team via [email protected] as soon as possible. The ARC will investigate the concerns and decide whether an Assessment should be removed from the process. This happens in rare circumstances and requires the Executive General Manager’s approval.

If inappropriate Assessments are identified early in the Assessment process, the ARC may ask the assessor to amend their Assessment or assign an alternative assessor to the Proposal.

2.2 Detailed Assessors

Assignment of Proposals and RMS Profile

Proposals are assigned to Detailed Assessors using information from their RMS profile and using one of two methods:

- Proposals are assigned by Carriage 1, a General Assessor on the Selection Advisory Committee (SAC) for a specific scheme round, or

- Proposals are assigned by an Executive Director of the ARC.

Detailed Assessors’ RMS profiles play an essential role in the assignment process as they assist with the matching of Proposals with appropriately skilled Detailed Assessors. It is important that Detailed Assessors ensure that their RMS profile is up to date and contains the following details:

- Expertise text: Please outline your expertise briefly. The following format is suggested “My major area of research expertise is in a, b, c. I also have experience in research q, r, s. I would also be able to assess in the areas of x, y, z.”

- Field of Research (FoR) Codes: Please include between six and ten 6-digit FoR codes that reflect your key areas of expertise.

Note: Obligated Assessors (those who are Participants on an ARC Project currently receiving funding) are required to keep their RMS profile up to date and to undertake assessments as required in the relevant Funding Agreement for their Project.

Detailed Assessments

Detailed Assessors provide scores and written comments addressing the selection criteria on each Proposal and may be assigned a number of Proposals within their field of research or across a broader disciplinary area on the basis of their RMS profile expertise text and FoR codes. Detailed Assessors are asked to:

- Complete in-depth Assessments of Proposals in RMS, providing scores and comments against scheme-specific criteria (refer to Appendix 1 for Discovery Program schemes and Appendix 2 for Linkage Program schemes)

- Identify the merits or otherwise of the Proposal with respect to the selection criteria set out in the Funding Rules

- Assess and score the Proposal for each selection criterion separately

- Provide advice about other aspects of the Proposal using the ‘Improvements’ or ‘Comments’ fields in RMS.

Detailed Assessor comments are made available to Applicants anonymously once a Proposal is open for a Rejoinder.

Detailed Assessors may receive Proposals to assess at any stage of the Assessment process due to late COIs being declared by other Assessors.

If a Detailed Assessor identifies a COI with an assigned Proposal this must be declared to the ARC and no further participation in the Assessment process should take place.

How to Ensure High Quality Detailed Assessments

Detailed Assessors are asked to provide high quality, constructive Assessments with the following:

- Objective and professional comments

- Detailed comments on the merits or otherwise of the Proposal with respect to the selection criteria (one or two sentences is not sufficient)

- Sufficient information to allow Applicants to provide a Rejoinder to comments about the Proposal

- Comments that align closely with ratings. For example, an ‘A’ rating should not be submitted if a Proposal is assessed as being of limited merit against a criterion. Further, if a ‘D’ rating is given, then suitable constructive criticisms and comments justifying the rating are required. It is important to remember that Applicants see only the comments and the SAC will see both comments and scores. It is essential that your scores and comments are fit for purpose and provide appropriate information for the person using them

- Comments that are fair, meaningful and balanced, addressing only issues relevant to the Proposal in terms of the selection criteria. Comments should provide a sound, comprehensive account of, and justification for, views about the Proposal, while respecting the care with which Proposals have been prepared

- Comments free from platitudes, exaggeration and understatement

- Timely submission via RMS by the ARC deadline.

Detailed Assessors can refer to the ARC Peer Review webpage for examples of good Detailed Assessments.

How to Avoid Inappropriate Assessments

Detailed Assessors are asked to avoid the following in their Assessment comments as this may render the Assessment inappropriate:

- Acronyms

- Generic comments used in multiple Assessments

- Very brief Assessment text

- Scores which do not align with Assessment text

- Scores that are included within the Assessment text

- Information that identifies researchers named on other Proposals

- Advice about their own identity, standing in, or understanding of, the research field in the Proposal

- The outcome or status of relevant research not mentioned in the Proposal

- Restatement or rephrasing of any part of the Proposal

- Comments about the potential ineligibility of a Proposal (this information should be provided to ARC-[email protected] (General Assessors) or [email protected] (Detailed Assessors) by email)

- Comments comparing one Proposal with another

- Text that has been copied from a previous Assessment

- Text that appears to be discriminatory, defamatory or distastefully irrelevant (such as gratuitous criticism of a researcher and/or Eligible Organisation).

Under no circumstances should Detailed Assessors contact researchers and/or institutions about a submitted Proposal or seek additional information from any sources.

Treatment of Inappropriate Assessments

Inappropriate Assessments compromise the integrity of the peer review process. To be fair to all Applicants, the ARC will reject Assessments with inappropriate or highly subjective comments from individual assessors about any aspect of the Proposal. If the ARC considers an Assessment to be inappropriate, the ARC may request that an Assessor amend the Assessment or may remove the Assessment from the process.


2.3 Rating and Ranking Assessments

Rating Scales

When applying the Rating Scales, Assessors should have regard for the specific scheme objectives (see Appendices 1 and 2).

Scoring Proposals against selection criteria can be a difficult exercise when assessors might only look at a small sub-set of Proposals. Scoring bands within the Rating Scale ideally represent a distribution across all Proposals submitted to a scheme.

Only the very best Proposals should be recommended. As a guide, approximately 10% should fall into the top scoring band (‘A’). These would have been assessed as near flawless Proposals across all selection criteria.

A rubric for the Rating Scales A to E is provided in Table 2 below and should guide scoring by both Detailed and General Assessors.

Table 2: Example rating scale

A (Outstanding): Of the highest quality and at the forefront of research in the field. Approximately 10% of Proposals should receive ratings in this band. Recommendation: Recommended unconditionally.

B (Excellent): Of high quality and strongly competitive. Approximately 15% of Proposals should receive ratings in this band. Recommendation: Strongly support recommendation of funding.

C (Very Good): Interesting, sound and compelling. Approximately 20% of Proposals should receive ratings in this band. Recommendation: Support recommendation of funding with reservation.

D (Good): Sound, but lacks a compelling element. Approximately 35% of Proposals are likely to fall into this band. Recommendation: Unsupportive of recommendation for funding.

E (Uncompetitive): Uncompetitive and has significant weaknesses. Approximately 20% of Proposals are likely to fall into this band. Recommendation: Not recommended for funding.

NOTE: This Rating Scale is an example only. Please see Appendices 1 and 2 for the Rating Scales applicable to each individual scheme.
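As a rough illustration only, the distribution guidance in the example scale above can be read as a set of cumulative cut-offs: the top 10% of Proposals fall in band A, the next 15% in band B, and so on. The sketch below is a hypothetical encoding of those example figures, not an ARC tool; each scheme’s own Rating Scale in Appendices 1 and 2 applies.

```python
# Cumulative share of Proposals at or above each band, taken from the
# example rating scale above (A 10%, B +15%, C +20%, D +35%, E +20%).
# Illustrative only: not an official ARC mapping.
BANDS = [("A", 0.10), ("B", 0.25), ("C", 0.45), ("D", 0.80), ("E", 1.00)]

def band_for_percentile(percentile: float) -> str:
    """Map a Proposal's percentile position (0.0 = strongest) to the
    rating band it would occupy under the example distribution."""
    for rating, cutoff in BANDS:
        if percentile <= cutoff:
            return rating
    return "E"
```

For example, a Proposal at the 30th percentile of the pool would sit in band C under the example figures.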

Ranking

Detailed and General Assessors who have been assigned multiple Proposals must establish a ranked list. Assigning scores to each Assessment is a convenient way of ranking Proposals. RMS will use your scores to automatically rank Proposals.

Each Proposal must have a unique rank, therefore Assessors who have multiple Assessments with an identical final rank are prompted by RMS to give each Assessment a unique rank to differentiate between them. Differentiation should be based on how you compare the Proposals in relation to the Rating Scale.

Detailed Assessors cannot leave a criterion score blank for any reason. Assessments can only be submitted when all Proposals have been assigned 1) a score and 2) a unique ranking.
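RMS’s internal ranking logic is not published, but the unique-rank requirement described above can be sketched as follows: Proposals are ordered by score, and any Proposal sharing a score with another is flagged so the assessor can break the tie manually before submission. The function name and Proposal IDs below are hypothetical.

```python
def rank_proposals(scores):
    """Rank Proposals by score, highest first.

    `scores` maps Proposal ID -> raw score. Proposals whose score
    equals the previous Proposal's score are flagged as ties needing
    a manual tie-break, mirroring the RMS prompt for a unique rank.
    (Illustrative sketch only; not the actual RMS algorithm.)
    """
    ordered = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    ranks, ties = {}, []
    for rank, (pid, score) in enumerate(ordered, start=1):
        ranks[pid] = rank
        # A Proposal tied with its predecessor needs a manual tie-break.
        if rank > 1 and score == ordered[rank - 2][1]:
            ties.append(pid)
    return ranks, ties

# Hypothetical usage: two Proposals tied at 85 are flagged.
ranks, ties = rank_proposals({"P1": 85, "P2": 92, "P3": 85})
```

Differentiation between the tied Proposals should then be made by the assessor, based on how the Proposals compare against the Rating Scale.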

Scores are normalised for all schemes except smaller schemes such as Discovery Indigenous, Linkage Infrastructure, Equipment and Facilities, Industrial Transformation Research Hubs, Industrial Transformation Training Centres, Special Research Initiatives and Linkage Projects.
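The ARC does not publish its normalisation formula. A standard approach to making scores from different assessors comparable, offered here purely as an assumption, is to standardise each assessor’s raw scores to zero mean and unit variance so that systematically harsh or generous scoring washes out:

```python
from statistics import mean, stdev

def normalise(assessor_scores):
    """Z-score standardisation of one assessor's raw scores.

    A generic sketch of score normalisation (assumption: the ARC's
    actual formula is not published). Harsh and generous assessors
    end up on a comparable scale.
    """
    mu, sigma = mean(assessor_scores), stdev(assessor_scores)
    if sigma == 0:
        # An assessor who gave every Proposal the same score carries
        # no ranking information; map everything to zero.
        return [0.0 for _ in assessor_scores]
    return [(s - mu) / sigma for s in assessor_scores]
```

Under this sketch, raw scores of 70, 80 and 90 from one assessor standardise to -1, 0 and 1.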

2.4 Important Factors To Consider When Assessing

All Assessors of ARC Proposals must have regard to the following when undertaking Assessments.

Objectives and Selection Criteria

Each scheme has specific objectives and selection criteria that aim to ensure funded Proposals achieve the best possible outcomes for the scheme. Assessors must have regard to both the objectives and the selection criteria as outlined in the Funding Rules and Appendices 1 and 2 of this document.


Research Opportunity and Performance Evidence (ROPE)

The ROPE selection criterion requires assessors to identify and consider research excellence relative to a researcher’s career and life experiences. It aims to ensure that NCGP assessment processes accurately evaluate a researcher’s career history relative to their current career stage, and consider whether their productivity and contribution are commensurate with the opportunities that have been available to them.

The required elements of ROPE vary according to the objectives of each scheme. All General and Detailed Assessors should be familiar with the full ROPE statement located on the ARC website.

Research Impact

The Research Impact Principles and Framework on the ARC website provides a definition of research impact and examples of where research components fit into an impact pathway. You should take into account Applicants’ information about the intended benefit of their Project when assessing a Proposal against a feasibility and benefit selection criterion.

Interdisciplinary Research

The ARC recognises the value of interdisciplinary research, and the ARC’s commitment to supporting interdisciplinary research is outlined in the ARC Statement of Support for Interdisciplinary Research.

Interdisciplinary research can be a distinct mode of research, or a combination of researchers, knowledge and/or approaches from disparate disciplines. Under the NCGP, examples of interdisciplinary research may include: researchers from different disciplines working together in a team; researchers collaborating to bring different perspectives to solve a problem; researcher(s) utilising methods normally associated with one or more disciplines to solve problems in another discipline; and one or more researchers translating innovative blue-sky or applied research outcomes from one discipline into an entirely different applied research discipline.

Assessors are required to assess all research on a fair and equal basis, including Proposals and outputs involving interdisciplinary and collaborative research. To assist with this, the ARC facilitates consideration of Proposals by relevant General Assessors with interdisciplinary expertise or where not feasible, Proposals are allocated to General Assessors who have broad disciplinary expertise regardless of discipline grouping. Interdisciplinary Proposals should be allocated to Detailed Assessors with specific interdisciplinary expertise or from different disciplines.

Data Management Requirements

In line with responsibilities outlined in the Australian Code for the Responsible Conduct of Research (2007) and international best practice, the ARC encourages researchers to deposit data arising from research projects in appropriate publicly accessible repositories.

The Project Description section of a Proposal requires researchers to outline briefly their plans for the specific management of data generated through the proposed Project. In answering this question, researchers are not asked to include extensive detail of the physical or technological infrastructure. However, a general compliance response is not helpful. Assessors must consider how the researcher plans to make data as openly accessible as possible for the purposes of verification and for others’ future research. Where it is inappropriate to disseminate or re-use data, assessors must consider the validity and timeliness of any justification provided.

3. General Assessors: Selection Advisory Committee (SAC) Meeting Preparation

3.1 Roles and Responsibilities before the SAC Meeting

After the Assessment period has closed, General Assessors will:

- be unable to access Proposals for a short period (3-5 days) whilst ARC staff undertake administrative functions to prepare for the SAC meeting

- after 3-5 days, have access to all Proposals in the RMS Meeting App where they do not have a COI

- be required to attend a pre-meeting videoconference to learn about the uncertainty bands, funding line, and estimated return and success rates.

Carriage 1: Reviewing Proposals in the Meeting App

Prior to the SAC meeting, Carriage 1 should review the Detailed and General Assessors’ Assessments and scores and consider whether they believe there are Proposals:

- in the fundable range or uncertainty band that should be lower;

- below the uncertainty band that should be higher; or

- in the fundable range that should/should not be considered for funding.

Particular attention should be given to Proposals where a Research Opportunity and Performance Evidence (ROPE) case (see Section 2.4) has been made that has been neglected by Detailed Assessors, or where an anomalous Detailed Assessment may materially affect the ranking of the Proposal. Carriage 1 should identify such Proposals and prepare a recommendation for consideration by the Selection Committee.

ARC staff will also identify Proposals with ‘disparate’ scores and will flag these for discussion at the SAC meeting. Carriage 1 will be expected to lead discussion on these Proposals.

Carriage 1: Prepare a one-line budget

If a Proposal is recommended for funding, it is Carriage 1’s responsibility to recommend an overall, one-line budget amount for each funding year of the Proposal. The budget recommendation is made at the SAC meeting following each recommendation for funding.

The budget recommended for each year must not exceed the amount requested in the Proposal. Budget recommendations are discussed by the SAC members and the recommended budget is forwarded to the ARC CEO as part of the SAC’s funding recommendations.

Prior to the SAC meeting, all General Assessors assigned to a Proposal should discuss and agree on an appropriate budget. Carriage 1 may need to discuss or justify their budget recommendation at the SAC meeting, and should therefore bring their own notes to the meeting on how they arrived at their final recommended funding amount.

To prepare a one-line budget for each year of funding, Carriage 1 should consider:

- The extent to which specific budget items are well justified

- Whether the budget items are supported or not supported as outlined in the relevant scheme’s Funding Rules

- The minimum/maximum funding amounts relevant to the specific scheme’s Funding Rules

- The costs of any recommended remunerated Participants.

All Carriages and SAC members

Prior to the SAC meeting, all members are advised of the Proposals that fall into the ‘uncertainty band’. SAC members are requested to briefly review their Proposals within this group and flag any to be discussed at the SAC meeting.

SAC members are welcome to read any Proposal accessible through the RMS Meeting App and contribute to discussion of that Proposal during the meeting.

3.2 Roles and Responsibilities at the SAC Meeting

Each SAC Meeting will comprise a Chair, SAC members (Carriage 1, other Carriages and panel members) and ARC staff. SAC Meetings may also be divided into discipline panels, depending on the scheme.

The role of the Chair is to:

- lead the committee through the process to recommend Proposals for funding

- call the panel to a vote for Proposals within the uncertainty band or where there is dissent

- ensure the meeting runs in a timely manner.

For Proposals where the Chair has a conflict of interest requiring them to leave the room, or is a Carriage on the Proposal, the Deputy Chair will act in the role.

When you are Carriage 1 on a Proposal, your role is to:

• lead discussion for that Proposal
• vote on Proposals when called by the Chair
• provide a one-line budget for Proposals that are recommended for funding
• for the Linkage Projects scheme, where required, confirm that a Proposal is suitable for fast-tracking.

All other Carriages will:

• contribute to discussions of Proposals
• vote on Proposals when called by the Chair.

ARC staff are responsible for:

• providing secretariat support for meetings
• providing procedural advice to selection panels
• ensuring that correct administrative procedures are followed
• ensuring, as Probity Officer, that COIs and inappropriate discussions are managed correctly.

3.3 Attendance Arrangements

Remuneration

After the SAC Meeting, members will be paid at the rates set out in their contract. All payments are subject to certification by a Director within the ARC’s Programs Branch that the member has carried out all required duties.

All sitting fees and reading rates are paid directly to the SAC member’s institution, as outlined in the contract. Travel allowance is paid directly to the SAC member.

Travel and Accommodation for SAC meetings

All travel and accommodation provided by the ARC must meet the requirements of the Whole-of-Australian-Government Travel Arrangements.

Air Travel

The ARC will organise air travel for SAC members to attend Selection Advisory Committee Meetings. A member of staff will contact SAC members prior to each meeting to make these arrangements.

Motor Vehicle Travel

Members may be authorised to use their own vehicle for official travel; however, prior approval must be obtained from a Director within the ARC’s Programs Branch. If approved, the ARC Meeting and Logistics team will provide advice on claiming a motor vehicle allowance.

Please note: the Commonwealth Government does not provide or carry any insurance on private vehicles.

Hire Vehicle

Members may be authorised to hire a motor vehicle for official travel; however, prior approval must be obtained from a Director within the ARC’s Programs Branch. The ARC Meeting and Logistics team will arrange the hire car through the Whole-of-Australian-Government Travel Services.

Under no circumstances are members to engage the use of limousine services.

Taxi Fares

SAC members are entitled to use a Cabcharge card only for official ARC business. The ARC Meeting and Logistics team will arrange a Cabcharge card for each SAC member, where required. Cabcharge card holders are required to:

• advise the driver on commencing the journey that they wish to use a Cabcharge card. If this is not accepted, a receipt should be obtained and reimbursement sought from the ARC. A request to pay form will be provided for this purpose
• ensure that only the required fare is charged (Cabcharge cards must not be used to ‘tip’ drivers)
• obtain a receipt for the journey and provide this receipt to the ARC on request.

Lost cards must be reported immediately to the ARC Meeting and Logistics team.

Should a Cabcharge card be used incorrectly or for private travel, the ARC will seek to recover the costs.

Travel Allowance and Accommodation

The ARC will organise accommodation for SAC members attending SAC meetings.

SAC members are entitled to a travel allowance according to the Remuneration Tribunal Determination Rates. When a meal or accommodation is provided by the ARC or other organisation, the travel allowance will be adjusted accordingly. The ARC will advise on total entitlements for specific visits or journeys. A copy of this determination will be attached to the contract.

4. Ensuring Integrity of Process

4.1 Confidentiality and conflict of interest (COI)

The ARC Conflict of Interest and Confidentiality Policy is designed to ensure that all COIs are managed in a rigorous and transparent way. It aims to prevent individuals from influencing decisions unfairly and to maintain public confidence in the integrity, legitimacy, impartiality and fairness of the peer review process.

Any individual who is reviewing material for the ARC must agree to a confidentiality and COI statement, and must clearly disclose any material personal interests that may affect, or might be perceived to affect, their ability to perform their role.

Examples of material personal interests that are considered by the ARC to be COIs include holding funding with a named Participant within the past two years, or having been a collaborator or co-author with a named Participant on a research output within the last four years. For more information on the timeframes that apply for common COIs, please refer to the Identifying and Handling a Conflict of Interest in NCGP processes document.

4.2 Research integrity and research misconduct

If in the course of undertaking an Assessment you identify or suspect a potential research integrity breach or research misconduct, please notify the ARC Research Integrity Office ([email protected]) in accordance with Section 5 of the ARC Research Integrity and Research Misconduct Policy. Please do not mention your concerns in your Assessment comments.

The ARC Research Integrity Office will consider whether to refer your concerns to the relevant institution for investigation in accordance with the requirements of the Australian Code for the Responsible Conduct of Research (2007). You should provide sufficient information to allow the ARC to assess whether there is a basis for referring the matter to the institution, and to enable the relevant institution to progress an investigation into the allegation (if required).

4.3 Proposals outside an assessor’s area of expertise

The ARC receives Proposals from many scholarly fields. Occasionally you will be asked to assess a Proposal that does not appear to correspond closely with your area of expertise, particularly if you are a General Assessor. Your views are valuable as they are being sought on the entire Proposal, drawing on your expert knowledge as a researcher. If you are a General Assessor and are concerned that a Proposal is outside your area of expertise, please contact the relevant scheme team via [email protected] before rejecting.

If you are a Detailed Assessor and believe that the ARC has misunderstood your expertise, or has made an error in assigning a Proposal to you, please give early notice of your view by rejecting the applicable Proposal/s in RMS and entering a reason in the Reject Reason comment box. It is also important to review your RMS profile expertise text and FoR codes.

4.4 Eligibility

If, while assessing a Proposal, you have concerns about eligibility, ethics or other issues associated with a Proposal, you must not include this information in your Assessment. Please send an email highlighting your concerns to the relevant scheme team via [email protected] (General Assessors) or [email protected] (Detailed Assessors) as soon as possible. The ARC is responsible for investigating and making decisions on these matters, and General and Detailed Assessors should not conduct investigations at any point. Please complete your Assessment based on the merits of the Proposal without giving consideration to the potential eligibility issue.

4.5 Unconscious bias

Assessors should also be aware of how their unconscious bias could affect the peer review process.

Unconscious biases are pervasive and may relate to perceptions about a range of attributes including:

• gender
• career path
• institutional employer
• discipline.

The ARC encourages Assessors to recognise their own biases and be aware of them in their Assessments. A selection of short, online tests for identifying unconscious biases is available via Harvard University’s ‘Implicit Social Attitudes’ demonstration sites.

5. Questions during the Assessment Process

For all assignment, assessment and accessibility enquiries, please email the relevant scheme team via [email protected] (General Assessors) or [email protected] (Detailed Assessors).

For all questions relating to the SAC and SAC meetings, contact [email protected].

Appendix 1: Discovery Program Rating Scale and Selection Criteria Considerations

Please note: Assessors assign a score and do not have to consider the weighting of a criterion as this is applied automatically within RMS. The table below provides ready access to Selection Criteria set out in the Funding Rules for schemes under the Discovery Program (2017 Edition) and the Rating Scales outlined in this handbook. Assessors should use their judgement and experience to assess the appropriate rating within the context of the relevant discipline.

For reference, the current Science and Research Priorities are: Food; Soil and Water; Transport; Cybersecurity; Energy; Resources; Advanced Manufacturing; Environmental Change; and Health.
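Although RMS applies the criterion weightings automatically, it can help to see how a weighted overall score is formed. The sketch below is illustrative only: it assumes per-criterion scores on a 0–100 scale and uses the DECRA (DE19) weightings listed in this appendix; the actual calculation performed inside RMS may differ.

```python
# Illustrative sketch: combining per-criterion scores with scheme weightings.
# The weights are the DECRA (DE19) weightings from this handbook; how RMS
# actually performs this calculation is internal to RMS.

DECRA_WEIGHTS = {
    "Proposed Project Quality and Innovation": 0.35,
    "DECRA Candidate": 0.40,
    "Feasibility": 0.10,
    "Benefit and Collaboration": 0.15,
}

def weighted_overall(scores: dict[str, float], weights: dict[str, float]) -> float:
    """Return the weighted sum of per-criterion scores (0-100 scale)."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 100%"
    return sum(scores[criterion] * w for criterion, w in weights.items())

# Hypothetical raw scores an assessor might assign to one Proposal.
scores = {
    "Proposed Project Quality and Innovation": 90,
    "DECRA Candidate": 80,
    "Feasibility": 70,
    "Benefit and Collaboration": 85,
}
print(round(weighted_overall(scores, DECRA_WEIGHTS), 2))  # 83.25
```

The point of the sketch is simply that a strong score on a heavily weighted criterion (here, DECRA Candidate at 40%) moves the overall result more than the same score on Feasibility at 10%.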

Discovery Early Career Researcher Award (DE19)

Objectives

The objectives of the Discovery Early Career Researcher Award scheme are to:

a. support excellent basic and applied research by early career researchers
b. advance promising early career researchers and promote enhanced opportunities for diverse career pathways
c. enable research and research training in high quality and supportive environments
d. expand Australia’s knowledge base and research capability
e. enhance the scale and focus of research in the Science and Research Priorities.

Selection Criteria and Rating Scales – DECRA

(A) Outstanding: Of the highest quality and at the forefront of research in the field. Approximately 10% of Proposals should receive ratings in this band.
(B) Excellent: Of high quality and strongly competitive. Approximately 15% of Proposals should receive ratings in this band.
(C) Very Good: Interesting, sound and compelling. Approximately 20% of Proposals should receive ratings in this band.
(D) Good: Sound, but lacks a compelling element. Approximately 35% of Proposals are likely to fall into this band.
(E) Uncompetitive: Has significant weaknesses. Approximately 20% of Proposals are likely to fall into this band.

The Selection Criteria below are from the Funding Rules for schemes under the Discovery Program (2017 edition) which are available on the ARC website.

Selection Criteria and weightings

Proposed Project Quality and Innovation: 35%
• Does the research address a significant problem?
• Is the conceptual/theoretical framework innovative and original?
• What is the potential for the research to contribute to the Science and Research Priorities?
• Will the aims, concepts, methods and results advance knowledge?

DECRA Candidate: 40%
• Research opportunity and performance evidence (ROPE)
• Time and capacity to undertake the proposed research.

Feasibility: 10%
• Do the Project’s design, participants and requested budget create confidence in the timely and successful completion of the Project?
• Is there an existing, or developing, supportive and high quality environment for this Candidate, their Project and for Higher Degree by Research students where appropriate?
• Are the necessary facilities available to complete the Project?

Benefit and Collaboration: 15%
• Will the completed Project produce significant new knowledge and/or innovative economic, commercial, environmental, social and/or cultural benefit to the Australian and international community?
• To what extent will the DECRA Candidate build collaborations across research organisations and/or industry and/or with other disciplines both within Australia and internationally?
• Will the proposed research be cost-effective and value for money?

Discovery Indigenous (IN19)

Objectives

The objectives of the Discovery Indigenous scheme are to:

a. support excellent basic and applied research and research training by Indigenous researchers as individuals and as teams
b. develop the research expertise of Indigenous researchers
c. support and retain established Indigenous researchers in Australian higher education institutions
d. expand Australia’s knowledge base and research capability.

Selection Criteria and Rating Scales – Discovery Indigenous

(A) Outstanding: Of the highest quality and at the forefront of research in the field. Approximately 10% of Proposals should receive ratings in this band.
(B) Excellent: Of high quality and strongly competitive. Approximately 15% of Proposals should receive ratings in this band.
(C) Very Good: Interesting, sound and compelling. Approximately 20% of Proposals should receive ratings in this band.
(D) Good: Sound, but lacks a compelling element. Approximately 35% of Proposals are likely to fall into this band.
(E) Uncompetitive: Has significant weaknesses. Approximately 20% of Proposals are likely to fall into this band.

The Selection Criteria below are from the Funding Rules for schemes under the Discovery Program (2017 edition) which are available on the ARC website.

Selection Criteria and weightings

Proposed Project Quality and Innovation: 40%
• Does the research address a significant problem?
• Is the conceptual/theoretical framework innovative and original?
• Will the aims, concepts, methods and results advance knowledge?
• What is the potential for the research to contribute to the Science and Research Priorities?

Investigator(s): 35%
• Research opportunity and performance evidence (ROPE)
• Time and capacity to undertake the proposed research.

Feasibility: 10%
• Are there strategies for enabling collaboration with Australian Aboriginal and Torres Strait Islander communities where appropriate (for example, dialogue/collaboration with an Indigenous cultural mentor)?
• Is there an existing or developing, supportive and high quality research community?
• Are the necessary facilities available to complete the Project?
• Is the design of the Project and the expertise of the participants sufficient to ensure the Project can be completed within the proposed budget and timeframe?

Benefit and Collaboration: 15%
• Will the completed Project produce significant new knowledge and/or innovative economic, commercial, environmental, social and/or cultural benefit to the Australian and international community?
• How will Host Organisations be utilised in the proposed Project?
• To what extent will the Project build collaborations across research organisations and/or industry and/or with other disciplines both within Australia and internationally?
• Will the proposed research be cost-effective and value for money?

Scoring band for a Discovery Australia Aboriginal and Torres Strait Islander Award (DAATSIA) candidate

Proposals that nominate a DAATSIA candidate must demonstrate how the Project quality would be enhanced by a DAATSIA and detail the ways in which the additional research time would be utilised (for example, undertaking sustained field research, archival research or laboratory work).

Proposals that include a proposed DAATSIA candidate require an additional numerical score. A separate field (additional to the above criteria) requires a score (0–100) specifically for the DAATSIA candidate, assessing their strength against the DAATSIA scoring bands.

During the Assessment of the DAATSIA, please consider subsection E6.3.4 of the Funding Rules which states that a DAATSIA candidate must demonstrate how a DAATSIA would enhance the Project quality and detail the ways in which the additional research time would be used (for example, undertaking sustained field research, archival research or laboratory work).

DAATSIAs are not fellowships and cannot be funded/held independently of a Discovery Indigenous Project. Accordingly, DAATSIA scores have no separate weighting in the overall score for a Proposal. A DAATSIA candidate makes up part of the team (they will remain as a CI if the DAATSIA is not awarded), and as such they should also be assessed as part of the team under the ‘Investigator(s)’ selection criterion, weighted at 35%.

The next table shows appropriate ranges for DAATSIA numerical scores depending on the merits of the DAATSIA candidate.

Scoring band (DIA):

91–100 Outstanding: Will benefit, enhance and expedite the Project, ensuring it is of the highest quality and at the forefront of research activity. Recommendation: recommended unconditionally.

86–90 Excellent: Will benefit, enhance and expedite the Project, ensuring high quality research. Recommendation: strongly support recommendation of funding.

76–85 Very Good: Will benefit, enhance and expedite the Project. Interesting, sound and compelling. Recommendation: conditionally support recommendation of funding.

51–75 Good: Sound, but lacks a compelling element. Recommendation: unsupportive of recommending for funding.

0–50 Uncompetitive: Has significant weaknesses. Recommendation: not recommended for funding.
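For illustration, the banding above can be expressed as a simple lookup. This is a hypothetical sketch only: the band boundaries are those in the table, but RMS itself does not expose such a function.

```python
# Illustrative sketch: mapping a DAATSIA numeric score (0-100) to its
# scoring band, using the band boundaries from the table in this handbook.

DAATSIA_BANDS = [
    (91, "Outstanding"),
    (86, "Excellent"),
    (76, "Very Good"),
    (51, "Good"),
    (0, "Uncompetitive"),
]

def daatsia_band(score: int) -> str:
    """Return the band label for a DAATSIA score between 0 and 100."""
    if not 0 <= score <= 100:
        raise ValueError("score must be between 0 and 100")
    for lower, label in DAATSIA_BANDS:
        if score >= lower:
            return label

print(daatsia_band(88))  # Excellent
print(daatsia_band(75))  # Good
```

Note the asymmetry of the bands: the top three labels span only 25 points (76–100), while scores of 75 and below fall into bands that do not support funding.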

Discovery Projects (DP19)

Objectives

The objectives of the Discovery Projects scheme are to:

a. support excellent basic and applied research by individuals and teams
b. encourage high-quality research and research training
c. enhance international collaboration in research
d. expand Australia’s knowledge base and research capability
e. enhance the scale and focus of research in the Science and Research Priorities.

Selection Criteria and Rating Scales – Discovery Projects

(A) Outstanding: Of the highest quality and at the forefront of research in the field. Approximately 10% of Proposals should receive ratings in this band.
(B) Excellent: Of high quality and strongly competitive. Approximately 15% of Proposals should receive ratings in this band.
(C) Very Good: Interesting, sound and compelling. Approximately 20% of Proposals should receive ratings in this band.
(D) Good: Sound, but lacks a compelling element. Approximately 35% of Proposals are likely to fall into this band.
(E) Uncompetitive: Has significant weaknesses. Approximately 20% of Proposals are likely to fall into this band.

The Selection Criteria below are from the Funding Rules for schemes under the Discovery Program (2017 edition) which are available on the ARC website.

Selection Criteria and weightings

Investigator(s): 35%
• Research opportunity and performance evidence (ROPE)
• Evidence of research training, mentoring and supervision
• Evidence of ability to build international linkages
• Time and capacity to undertake the proposed research.

Proposed Project Quality and Innovation: 40%
• Does the research address a significant problem?
• Is the conceptual/theoretical framework innovative and original?
• What is the potential for the research to contribute to the Science and Research Priorities?
• Will the aims, concepts, methods and results advance knowledge?
• What is the potential for the research to enhance international collaboration?

Feasibility: 10%
• Do the Project’s design, participants and requested budget create confidence in the timely and successful completion of the Project?
• Is there an existing, or developing, supportive and high quality environment for this Project and for Higher Degree by Research students where appropriate?
• Are the necessary facilities available to complete the Project?

Benefit: 15%
• Will the completed Project produce significant new knowledge and/or innovative economic, commercial, environmental, social and/or cultural benefit to the Australian and international community?
• Will the proposed research be cost-effective and value for money?

Appendix 2: Linkage Program Rating Scale and Selection Criteria Considerations

Please note: Assessors assign a score and do not have to consider the weighting of a criterion as this is applied automatically within RMS. The tables below provide ready access to the Selection Criteria set out in the Funding Rules for schemes under the Linkage Program (2017 Edition) (for LP18 and LE19) and in the Linkage Program - Special Research Initiative: PFAS (per- and poly-fluoroalkyl substances) Remediation Research Program Grant Guidelines (for SR18), together with the Rating Scales outlined in this handbook. Assessors should use their judgement and experience to assess the appropriate rating within the context of the relevant discipline.

Linkage Projects (LP18)

Objectives

The objectives of the Linkage Projects scheme are to:

a. support the initiation and/or development of long-term strategic research alliances between higher education organisations and other organisations, including industry and other research end-users, in order to apply advanced knowledge to problems and/or to provide opportunities to obtain national economic, commercial, social or cultural benefits

b. provide opportunities for internationally competitive research projects to be conducted in collaboration with organisations outside the higher education sector, targeting those who have demonstrated a clear commitment to high-quality research

c. encourage growth of a national pool of world-class researchers to meet the needs of the broader Australian innovation system
d. build the scale and focus of research in the national Science and Research Priorities.

Selection Criteria and Rating Scales – Linkage Projects

(A) Outstanding: Of the highest quality and at the forefront of research in the field. Approximately 10% of Proposals should receive ratings in this band.
(B) Excellent: Of high quality and strongly competitive. Approximately 15% of Proposals should receive ratings in this band.
(C) Very Good: Interesting, sound and compelling. Approximately 20% of Proposals should receive ratings in this band.
(D) Good: Sound, but lacks a compelling element. Approximately 35% of Proposals are likely to fall into this band.
(E) Uncompetitive: Has significant weaknesses. Approximately 20% of Proposals are likely to fall into this band.

The Selection Criteria below are from the Funding Rules for schemes under the Linkage Program (2017 edition) which are available on the ARC website.

Selection Criteria and weightings

Investigator(s): 25%
• Research Opportunity and Performance Evidence (ROPE)
• Potential to engage in collaborative research with end-users
• Evidence of research training, mentoring and supervision
• Time and capacity to undertake and manage the proposed research in collaboration with the Partner Organisation(s).

Project Quality and Innovation: 25%

Significance and Innovation
• Will new methods or technologies be developed that address a specific market opportunity?
• How will the anticipated outcomes advance the knowledge base and/or address an important problem and/or provide an end-user and/or industry advantage?
• Does the Project plan provide a business model for implementation?
• Does the proposed Project address the Science and Research Priorities?
• Are the proposed Project aims and concepts novel and innovative?
• Does the proposed Project significantly enhance links with organisations outside the Australian publicly-funded research and higher education sectors?

Approach and Training
• Are the conceptual framework, design, methods and analyses adequately developed, well integrated and appropriate to the aims of the proposed Project?
• Where relevant, is the intellectual content and scale of the work proposed appropriate to a higher degree by research?

Feasibility: 20%
• Is there an existing, or developing, supportive and high-quality environment for this research both within the Administering Organisation and in the Partner Organisation(s)?
• Are the necessary facilities available to conduct the proposed research?
• Is there evidence that each of the Partner Organisation(s) is genuinely committed to, and prepared to collaborate in, the proposed research Project?
• How adequate are the Cash and in-kind Contributions?

Benefit: 30%
• How will the proposed Project benefit Partner Organisation(s) and other relevant end-users?
• Will the proposed research encourage and develop strategic research alliances between the higher education organisation(s) and other organisation(s)?
• Will the proposed research maximise economic, commercial, environmental and/or social benefit to Australia?
• Are there adequate strategies to encourage dissemination, commercialisation, if appropriate, and promotion of research outcomes?
• Is it demonstrated that, where relevant, the applicants have identified the freedom to operate in the Intellectual Property and patent landscape to enable future benefits to end-users and/or industry?
• Does the proposed Project represent value for money?

Linkage Infrastructure, Equipment and Facilities (LE19)

Objectives

The objectives of the Linkage Infrastructure, Equipment and Facilities scheme are to:

a. encourage Eligible Organisations to develop collaborative arrangements with other Eligible Organisations and/or Partner Organisations to develop and support research infrastructure

b. support large-scale national or international cooperative initiatives allowing expensive research infrastructure to be shared and/or accessed
c. support areas of existing and/or emerging research strength
d. support and develop research infrastructure for the broader research community.

Selection Criteria and Rating Scales – Linkage Infrastructure, Equipment and Facilities

(A) Outstanding: Of the highest quality and at the forefront of research in the field. Approximately 10% of Proposals should receive ratings in this band.
(B) Excellent: Of high quality and strongly competitive. Approximately 15% of Proposals should receive ratings in this band.
(C) Very Good: Interesting, sound and compelling. Approximately 20% of Proposals should receive ratings in this band.
(D) Good: Sound, but lacks a compelling element. Approximately 35% of Proposals are likely to fall into this band.
(E) Uncompetitive: Has significant weaknesses. Approximately 20% of Proposals are likely to fall into this band.

The Selection Criteria below are from the Funding Rules for schemes under the Linkage Program (2017 edition) which are available on the ARC website.

Selection Criteria and weightings

Project Quality and Innovation: 25%
• Nature of the research, including aims and significance
• Relevance of the proposed research infrastructure to the needs of ARC and other competitively funded research projects/programs
• Enhancement of support for areas of existing and/or emerging research strength
• Demonstrated national or international focus for large-scale cooperative initiatives.

Feasibility: 25%
• Relevance of the research to the strategic priorities of the organisations
• Evidence that each of the organisations is genuinely committed to, and prepared to collaborate in, the proposed Project
• Existing or planned strategic research alliances between the higher education organisation(s) and other organisation(s)
• Effectiveness of cooperative arrangements for the management and sharing of the proposed research infrastructure, including arrangements for ongoing operational expenditure where applicable.

Investigator(s): 20%
• Track record of investigators relevant to the use of the proposed research infrastructure, with consideration given to Research Opportunity and Performance Evidence (ROPE)
• For CIs and PIs who will manage the purchase, design, manufacture, installation, maintenance and coordination of access to the proposed research infrastructure, a demonstrated record in these activities
• Evidence of research training, mentoring and supervision
• Relevance of the research infrastructure to the research capacity and planned activities of each CI and PI on the Proposal and, where relevant, to the research groups represented on the Proposal.

Benefit: 30%
• Availability of and access to similar research infrastructure at organisational, regional, national and/or international level
• Demonstrated need from the researchers and/or research projects that will utilise the proposed research infrastructure, including level of demand and likely measurable impact on the research program, including beyond the Project Activity Period
• Value for money and budget justification for cash and in-kind contributions, and the expected rate of use of the proposed research infrastructure
• Planned use of the proposed research infrastructure, including proposed arrangements for broader access to individuals not named on the Proposal and the alignment of this planned use with other similar existing infrastructure within Australia and/or internationally
• Plans to ensure that publicly funded research data generated from LIEF infrastructure is made open
• Special needs for regional or otherwise remote institutions
• Benefit of the proposed research infrastructure to the national research community
• Is it demonstrated that, where relevant, the applicants have identified the freedom to operate in the Intellectual Property and patent landscape to enable future benefits to industry?

Special Research Initiative: PFAS Remediation Research Program (SR18)

Objectives

The objectives of the Special Research Initiative: PFAS (per- and poly-fluoroalkyl substances) Remediation Research Program (SR18) are to:

a. develop technologies which will do one or more of the following:
- immobilise PFAS and prevent its movement in the environment
- remove PFAS from soil, water and/or waste
- break down PFAS by destroying the carbon-fluorine bonds, to form environmentally benign products with limited adverse environmental outcomes

b. establish the potential for relevant technologies developed to be deployed in the field to treat contaminated soil, waterways, waste, debris and/or large volumes of groundwater

c. develop technologies that can be scaled up for efficient and effective field deployment.

Selection Criteria and Rating Scales – Special Research Initiative: PFAS Remediation Research Program (SR18)

(A) Exceptional: Of the highest quality and at the forefront of research in the field. Approximately 10% of Proposals should receive ratings in this band.
(B) Outstanding: Of high quality and strongly competitive. Approximately 15% of Proposals should receive ratings in this band.
(C) Excellent: Interesting, sound and compelling. Approximately 20% of Proposals should receive ratings in this band.
(D) Very Good: Sound, but lacks a compelling element. Approximately 35% of Proposals are likely to fall into this band.
(E) Good: Has significant weaknesses. Approximately 20% of Proposals are likely to fall into this band.

The Selection Criteria below are from the Linkage Program—Special Research Initiative: PFAS (per- and poly-fluoroalkyl substances) Remediation Research Program Grant Guidelines which are available on the ARC website.

Selection Criteria and weightings

Selection Criteria Details

Research project—quality and innovation: 30%

How will the proposed research Project address the objectives of the PFAS Remediation Research Program? How will the Project address one or more of the research priority areas of the PFAS Remediation Research Program? Addressing multiple PFAS Research Priority Areas will strengthen the application. How will the Project lead to an advancement of PFAS knowledge, expertise and technologies? How will the Project monitor effective remediation of PFAS? What are the expected volumes of PFAS contaminated materials that the technology will be able to process? What is the likely treatment efficiency of the proposed technology? How will the treatment efficiency be measured under different treatment conditions and scenarios?

People and partners: 30%

What will be the contribution of the Chief Investigators and Partner Investigators to the Project and do they have appropriate capacity, capability and commitment to the Project?

Why are the investigators and partners suitable and relevant to the conduct and delivery of the Project, including Research Opportunity and Performance Evidence (ROPE)?

How will the Project be managed in collaboration with the Participating Organisation(s) and what is their role?

Feasibility: 20%

How will the Project be completed within the proposed budget and timeframe? This should include identified risks and mitigation strategies.

What evidence is there that each of the Participating Organisation(s) is genuinely committed to, and prepared to participate in, the Project, including through in-kind or funding support?

Are the necessary facilities and contaminated materials available to conduct the Project? What is an appropriate organisational, management and governance structure for the Project? How will the proposed conceptual framework, design, methods and analyses, Project structure and risk mitigation strategies be developed, integrated and appropriate to the aims of the Project? How will the Project framework ensure any identified remediation technologies will be independently verified? For destructive technologies, does the project plan identify how PFAS destruction will be analytically proven, and how potential production of environmentally harmful by-products will be assessed?


Benefit and outcomes: 20%

Will the Project maximise environmental benefit to Australia and minimise economic imposts? How will partners and end-users be involved in the research, dissemination, translation and/or implementation of outcomes? How will the Project deliver value for money?


Glossary

AIATSIS means the Australian Institute of Aboriginal and Torres Strait Islander Studies

Applicant means the Administering Organisation.

ARC means the Australian Research Council, as established under the ARC Act.

ARC Act means the Australian Research Council Act 2001.

ARC College of Experts (CoE) means a body of experts of international standing appointed to assist the ARC to identify research excellence, moderate external assessments and recommend fundable Proposals.

ARC website means http://www.arc.gov.au.

Carriage 1 means the General Assessor with the primary responsibility for the Proposal, and who is responsible for the assignment of Detailed Assessors to the Proposal for most schemes.

Chief Executive Officer (CEO) means the person holding the position of ARC Chief Executive Officer in accordance with the ARC Act or any person acting in that position.

Conflict of Interest (COI) means any conflict of interest, any risk of a conflict of interest and any apparent conflict of interest arising through a party engaging in any activity, participating in any association, holding any membership or obtaining any interest that is likely to conflict with or restrict that party participating in the Project. The ARC Conflict of Interest and Confidentiality Policy is available on the ARC website at www.arc.gov.au.

CSIRO means the Commonwealth Scientific and Industrial Research Organisation.

Detailed Assessment means an assessment process completed by the Detailed Assessor which involves an in-depth assessment of Proposals. A Detailed Assessment provides scores and comments against the scheme specific selection criteria. The Detailed Assessment is then taken into consideration by General Assessors (i.e. CoE or SAC members) in the later stages of the peer review process.

Detailed Assessors means assessors that are drawn from the Australian and international research community and are assigned Proposals to review for their specific expertise in a field of research. A Detailed Assessor completes in-depth assessments of Proposals by providing scores and comments against the scheme specific selection criteria.

Discovery Program means, for the purposes of eligibility, the schemes funded under the Discovery Program of the NCGP which consist of: Australian Laureate Fellowships, Discovery Early Career Researcher Award, Discovery Indigenous, Discovery Projects, Future Fellowships and other schemes as updated from time to time.

FoR Codes means Field of Research Codes as defined in the Australian Bureau of Statistics’ Australian and New Zealand Standard Research Classification (ANZSRC) (2008).

Fundable Range refers to Proposals that are highly ranked and are above the Uncertainty Band.

Funding Line is the estimated point in the ranked list of Proposals at which scheme funding would be completely allocated.

Funding Rules are Legislative Instruments, required by the ARC Act and approved by the Minister, outlining information for the relevant scheme/s relating to eligibility criteria, application process, assessment process, and any other additional accountability requirements that the ARC considers necessary.

General Assessment means a review process completed by the General Assessor(s), taking into consideration the scores and comments provided by Detailed Assessors and the Applicant Rejoinder. Scores on each of the relevant scheme selection criteria are provided as part of the General Assessment.

General Assessors means the assessors appointed to a Selection Advisory Committee for each scheme round, often drawn from the ARC College of Experts. General Assessors contribute knowledge of their discipline areas and a broad understanding of intellectual and methodological issues and good research planning.

Linkage Program means the schemes funded under the Linkage Program of the NCGP which consists of: Linkage Projects, Linkage Infrastructure, Equipment and Facilities, Industrial Transformation Research Hubs, Industrial Transformation Training Centres, Special Research Initiatives, the ARC Centres of Excellence, Learned Academies Special Projects, Supporting Responses to Commonwealth Science Council Priorities and other schemes as updated from time to time.

NATSIHEC means the National Aboriginal and Torres Strait Islander Higher Education Consortium.

Other Carriage means the General Assessor with secondary or tertiary responsibility for the Proposal.

Participant means a person named as an Investigator on a Proposal.

Proposal means a request to the ARC for the provision of funding which is submitted in accordance with the relevant Funding Rules.

Rating Scale refers to a set of guidelines provided to assessors on the degree of merit associated with each rating on the scale in relation to the relevant scheme selection criteria.

Rejoinder means a process by which Applicants are given an opportunity to respond to Assessment comments made by external (Detailed) assessors via a written submission. Rejoinders are not viewed by external assessors but are considered by an ARC CoE Panel or SAC during the moderation and recommendation process.

RMS means the ARC Research Management System at https://rms.arc.gov.au. Further information on RMS can be found at http://www.arc.gov.au/rms-information.

RMS Meeting App refers to the RMS meeting application available to SAC members in preparation for, and at, the selection meeting.

Selection Advisory Committee (SAC) means a group of experts from industry and/or academia appointed to assist the ARC to evaluate Proposals and to provide a recommendation for funding to the ARC.

Uncertainty Band refers to Proposals ranked within a defined range above and below the notional Funding Line. The number of Proposals in this band will vary depending on the size of the scheme.


Frequently Asked Questions

Why do I have to keep changing my password for RMS?

The Australian Research Council is a Government entity and, as such, our systems must comply with the whole-of-government security policy. This policy is set out by the Australian Signals Directorate and is publicly available on its website; the relevant controls can be found on page 193.

These policies are put in place to protect the information within Australian Government systems, including personal information relating to our ARC assessors. The increasing use of technology as a way of doing our business requires us to strengthen our information security.

What if I’m not sure if I have a conflict of interest or not?

The ARC’s Conflict of Interest and Confidentiality Policy provides guidance on conflicts. Further guidance is provided through Identifying and Handling Conflicts of Interest in NCGP processes. Where there is still doubt, assessors should email the relevant scheme team via [email protected] (General Assessors) or [email protected] (Detailed Assessors) for advice.

What if I pick up eligibility issues as part of my Assessment?

Eligibility is managed as a separate process to the peer review process. Any eligibility issues should be emailed to the relevant scheme team via [email protected] (General Assessors) or [email protected] (Detailed Assessors) for investigation. Assessments should be completed based on the merit of the Proposal. It is important not to include potential eligibility issues in Assessments.

Why can’t I see the ‘submit’ button?

The most common reason for the ‘submit’ button not showing is that the Proposals you are reviewing have not been ranked. You must enter your rankings for each group of Proposals before they can be submitted.

Why have I lost the assessments I have been working on?

The most common reason for assessments to be lost is an assessor having two sessions of RMS open at the same time. It is best practice to have only one session of RMS open at a time and to save your assessments regularly.

RMS runs best with Google Chrome or Internet Explorer.

For General Assessors

When do I submit my assessments?

General Assessors should not submit any assessments until after the Detailed Assessments have been completed and Rejoinders have closed. You should also discuss your final scores with your other carriages before submitting.


Why can’t I see the Detailed Assessments and Rejoinders?

You will not be able to view the Detailed Assessments or Rejoinders until those modules have been closed in RMS. You will be notified when you have access to the Detailed Assessments and Rejoinders.

Why can’t I assign a particular Detailed Assessor who I know is in RMS to a Proposal?

There are various reasons why a particular Assessor may not come up on the list of possible Detailed Assessors, including COIs or availability. You must assign Detailed Assessors that are on the list generated by RMS.


Updated sections in Assessor Handbook

Version 4.0 updates

General updates throughout the document, including update of the document structure to support readability and accessibility.
