KubeCon + CloudNativeCon Europe
""

Submission Reviewer Guidelines

OVERVIEW

Thank you in advance for your efforts as a submission reviewer for KubeCon + CloudNativeCon Europe.

Below you will find a guide to Sessionize’s evaluation mode, along with best practices to use when reviewing your set of proposals. Please bookmark this page for easy reference. If you have any questions, please email the CNCF Content Team.

By participating as a reviewer, you agree to the CNCF Program Committee and Events Volunteer Guidelines.

IMPORTANT DATES TO REMEMBER

  • Program Committee Review Period: 25 November – 8 December at 11:59pm GMT (UTC+0)
  • Track Chair Recommendations Period: 9-19 December
  • Co-Chair Selection Period + Schedule Building: 20 December – 10 January
  • Schedule Announced: Wednesday, 15 January
  • Event Dates: 1-4 April, 2025

Program Committee Responsibilities

CNCF ambassadors and maintainers across various projects are invited to apply for the Program Committee. To ensure diversity on the committee and mitigate bias, PC members are carefully selected by the program co-chairs, taking into account factors such as their specific interests and domain expertise. Members are also selected from a diverse range of backgrounds, considering gender, geographic, company, and project diversity.

Program Committee members carefully evaluate 75-125 proposals in their assigned topic for each event. They are limited to participating in a maximum of two events per year.

  • Program Committee members will receive an invitation to Sessionize as an evaluator on Monday, 18 November. Please accept this invitation as soon as possible.
    Important: you must accept this invitation by Sunday, 24 November at 11:59pm GMT (UTC+0); otherwise, we cannot add you to the evaluation plan and you will not be able to participate.
  • The evaluation plan for your track will be activated in Sessionize by the end of the day on 25 November.
  • Program Committee members will use Star Rating Evaluation Mode to complete reviews by the deadline.
  • Upon completion, your scores will be combined with those of your fellow Program Committee members in your track. The highest-scored 30% of submissions will move on to the Track Chairs for an additional evaluation.

Program Committee Benefits

To say thank you for your hard work, we are offering reviewers a tiered registration discount system:

  • 50% off Attendee Registration when reviewing 50-100 proposals
  • 100% off Attendee Registration when reviewing 101-200 proposals, plus a $100 CNCF Online Store Gift Card redeemable at the online Cloud Native Store website.
  • Each member also receives a Program Committee Member digital badge from Credly.com to display on social channels.

Registration codes will be sent within a week after the schedule announcement and will include your online Cloud Native Store Gift Card discount code. Please reach out to speakers@cncf.io if you have any questions regarding the benefits listed above.

Track Chair Responsibilities

For each track, two Track Chairs are chosen from the applicants for the Program Committee, with a preference for subject matter experts and those who have previous PC experience.

Track Chairs are not part of the program committee, enabling a fresh and unbiased perspective during the evaluation process. They are limited to participating as a track chair in one event per year.

  • Track Chairs will be provided a spreadsheet with the program committee’s scores and comments for their track, sorted from highest-scored to lowest-scored.
  • They will work together to approve roughly 30% of the sessions on their spreadsheet to move forward for the co-chair review stage.
  • As experts in their track, Track Chairs must leave a comment for each session they approve to assist the co-chairs with their final selections.
  • It is important that Track Chairs work together to complete their recommendations by the deadline as there is no room for an extension.

Track Chair Benefits

To say thank you for your hard work, we are offering Track Chairs:

  • 100% off Attendee Registration and a $200 CNCF Online Store Gift Card redeemable at the online Cloud Native Store website.
  • Each track chair will also receive a Track Chair digital badge from Credly.com to display on social channels.

Registration codes will be sent within a week after the schedule announcement and will include your online Cloud Native Store Gift Card discount code. Please reach out to speakers@cncf.io if you have any questions regarding the benefits listed above.

REVIEW PROCESS + BEST PRACTICES

  • Time Commitment: Please plan on committing up to 8 hours total to review all of the submissions in your track, depending on the number you have been assigned. Aim to review 10-15 sessions at a time, then take a break and walk away. This helps prevent burnout and allows you to see more proposals with fresh eyes.
  • Process Integrity: It is very important to protect the integrity of the review process, and to avoid undue bias, by keeping the submissions and your comments on them confidential. Please review and adhere to our Code of Conduct.
  • Public & Author Interaction: To ensure an unbiased review process, program committee members should not discuss submissions with authors and/or the overall public (i.e., please no tweeting). Of course, please feel free to tweet about accepted sessions that you are excited to attend once the schedule has been published.
  • Conflict of Interest: Reviewers are asked to wear their “KubeCon + CloudNativeCon” hats, rather than their company or other affiliation hats, when scoring submissions so that all submissions are rated fairly. If a submission was written by a colleague you work closely with, or by someone you may be seen to be associated with or in competition with, please skip that submission.
  • Topic Re-Routing: If you believe a talk would be better suited in a different topic, please leave a comment so we can move it to the correct track.
  • Experience Level: Use your expert knowledge to assess the experience level the audience will need to understand the presentation. If you feel the presentation does not match the experience level the speaker indicated, please leave a comment indicating which level would be a better fit.
  • Speakers with multiple submissions: Speakers may be listed on a maximum of one session and one panel. If you find yourself reviewing more proposals from the same speaker than allowed, please indicate this in the comment box and continue scoring.
  • Panel Discussions: The ideal panel is composed of diverse thought leaders who talk about 80% of the time with 20% audience interaction. Some things to keep in mind when reviewing a panel submission:
    • Is the panel diverse? Is there a mix of genders on the panel? Note for all CNCF Events: All sessions with 3+ speakers must have at least one speaker who does not identify as a man.
    • Is the submission cohesive and does it provide a clear view of how the panel would progress for 35 minutes? Could they cover everything within the proposal in the given 35 minutes?
    • Have they included any sample questions?
    • Does the panel include panelists from different organizations, including the moderator?
    • Research the panelists and moderator, if needed. Is their experience relevant to the topic?
    • Will the panelists provide diverse perspectives or will they repeat the same thing four times?
    • Are there any high-profile panelists?
    • In the instance that 1-2 of the panelists are unable to attend, how would it impact the panel?
  • Breakout Sessions: A breakout session is a presentation delivered by a topic expert with a fresh or unique point of view. Some things to keep in mind when reviewing presentation proposals:
    • Is the submission well written?
    • Is the topic relevant and original, and are the speakers considered subject matter experts?
    • Are they talking about a specific product from their company? If so, is it presented in an engaging way that is not advertorial? Keep in mind that sessions that come across as a pitch or infomercial for a company are most often rated very poorly by the audience.
    • Who is their target audience? Do the abstract and description match the expertise required?

STAR RATING EVALUATION MODE

Screenshot of Star Rating Mode
The progress bar updates in real time, letting you know how far you’ve come in the evaluation process

The Star rating evaluation mode is highly user-friendly, requiring only a review of the session information and a rating between one and five stars. Half-star ratings (0.5, 1.5, 2.5, 3.5, 4.5) are also available. Upon completion, simply click the Save and continue button to confirm your rating and proceed to the next session.

Screenshot of Star Rating Overview

KubeCon + CloudNativeCon utilizes a star rating evaluation plan that involves multiple criteria. As an evaluator, you will assess sessions based on the following four criteria, rather than assigning a single overall rating:

  • Content: The relevance and coherence of the session’s content, the quality of the proposal, and the likelihood of effective delivery by the speaker.
  • Originality: The degree to which the session presents new and innovative ideas or approaches, as well as the originality of its delivery.
  • Relevance: The extent to which the session’s content provides new and exciting insights or information that is relevant to the conference.
  • Speaker(s): The suitability of the proposed speaker(s) based on their expertise and alignment with the subject matter.

In addition, it is mandatory to provide feedback in the form of a comment for each session. It is important to ensure that feedback is constructive, especially for rejected proposals, as submitting authors may range from a VP at a large company to a university student. Constructive feedback may include highlighting the positive aspects of a proposal, offering helpful suggestions, and providing factual feedback.

It is crucial to avoid direct attacks and instead focus on objective feedback that can help improve the proposal. Moreover, we strongly advise against using vague comments like “Scoring was tough, I had to cut this” or “LGTM.” Instead, it is essential to provide thoughtful and insightful comments that will assist the co-chairs in making their final selections.

Once you have finished rating a session based on the criteria and provided constructive feedback, click on Save and continue to proceed to the next session. Please note that if your comments are deemed unconstructive, you may not be invited to serve as a program committee member in the future.

Screenshot of how to save and continue later

If you have a hard time coming to a decision about a certain session, you have the option to skip it and come back to it later. Simply click on the arrow next to the Save and continue button to expand the menu and select the Skip and ask later option. This can be particularly useful if you have only just started the evaluation process and would like to get a better sense of the overall quality of the submitted sessions.

If a session covers a subject you’re completely unfamiliar with or poses a conflict of interest, click on the Ignore this session button. The evaluation system won’t ask you about that session anymore.

Track your progress

Screenshot of the evaluation bar

During the evaluation process, a progress bar will be displayed at the top of the page, providing an indication of your progress. If, at any point, you need to pause the evaluation process, click on the Stop and continue later button located above the progress bar. Upon returning, you can resume the evaluation from where you left off.

On the right is a box with several useful tabs. The default tab is Recent. You can use it to keep track of your past session evaluations, but it has an additional purpose: you can click on any of the sessions to reopen them and potentially change your evaluation.

Here’s a complete overview of the tabs found in the aforementioned box:

  • Recent – a list of sessions you recently evaluated
  • Speaker – see other sessions submitted by the same speaker (assuming they exist)
  • Similar – browse similar sessions
  • Search – look through all nominated sessions
Screen grab of the four tab options.
Keep track of your past evaluations and reevaluate the ones you aren’t happy with

Complete the evaluation and view your stats

Screenshot of star rating dashboard stats
Change your evaluation even after you’ve completed the initial process

Once you’re done with the evaluation, you’ll automatically be redirected to the Evaluation page. By opening the evaluation plan you’ve just completed, you can view your statistics, as well as potentially change your mind on any of the sessions by clicking on the corresponding edit button.

Your scores will be combined with those of the other members reviewing your track, and the top 30% of submissions will move on to the next stage of evaluation.

CNCF Program Committee + Event Volunteer Guidelines

Volunteers who help with the planning, organization, or production of a CNCF event are often seen as representatives of the CNCF community or CNCF project that the event relates to, and their actions can meaningfully impact participants’ experience and perception of the event.  Therefore, and in the interest of fostering an open, positive, and welcoming environment for our community, it’s important that all event volunteers hold themselves to a high standard of professional conduct as described below.

These guidelines apply to a volunteer’s conduct and statements that relate to or could have an impact on any CNCF event that the volunteer helps plan, organize, select speakers for, or otherwise serve as a volunteer for.  These guidelines apply to relevant conduct occurring before, during, and after the event, both within community spaces and outside such spaces (including statements from personal social media accounts), and to both virtual and physical events. In addition to these guidelines, event volunteers must also comply with The Linux Foundation Event Code of Conduct and the CNCF Code of Conduct.

Be professional and courteous

Event volunteers will:

  • Conduct themselves in a professional manner suitable for a workplace environment;
  • Treat other event participants (including speakers, sponsors, exhibitors, attendees, volunteers, and staff) with courtesy and kindness; and
  • In their event-related communications, express their opinions in a courteous and respectful manner, even when disagreeing with others, and refrain from using obscenities, insults, rude or derisive language, excessive profanity, or other unprofessional language, images, or content.

Express feedback constructively, not destructively

The manner in which event volunteers communicate can have a large impact personally and professionally on others in the community. Event volunteers should strive to provide feedback or criticism relating to the event or any person or organization’s participation in the event in a constructive manner that supports others in learning, growing, and improving (e.g., offering suggestions for improvement).  Event volunteers should avoid providing feedback in a destructive or demeaning manner (e.g., insulting or publicly shaming someone for their mistakes).

Be considerate when choosing communication channels

Event volunteers should be considerate in choosing channels for communicating feedback.  Positive or neutral feedback may be communicated in any channel or medium.  In contrast, criticism about any individual event participant, staff member, or volunteer should be communicated in one or more private channels (rather than publicly) to avoid causing unnecessary embarrassment.  Criticism about an event that is not about specific individuals may be expressed privately or publicly, so long as it is expressed in a respectful, considerate, and professional manner.

Treat sensitive data confidentially and with respect

Event volunteers may have access to details about proposed or accepted speakers and the contents of their talks. They are required to adhere to The Linux Foundation’s guidelines regarding use of this information and may only use it for the purpose of choosing talks for an event. They are prohibited from using this data for any other purpose, including but not limited to the following:

  1. Using the information for unrelated business purposes
  2. Contacting speakers for any purpose other than evaluating their submission for this event
  3. Asking a submitter to speak at another event or recruiting them for another role
  4. Sharing the information with anyone outside of the program committee or sharing the acceptance of the talk prior to the schedule and abstract being announced

Changes to These Guidelines and Consequences for Noncompliance 

The event organizers may update these guidelines from time to time and will notify volunteers by email and via the CNCF Slack channels designated for event volunteers. However, any changes to these guidelines will not apply retroactively. If the Linux Foundation Events team determines that a volunteer has violated these guidelines or The Linux Foundation Event Code of Conduct, it may result in the volunteer’s immediate suspension or removal from any event-related volunteer positions they hold, including participation in event-related committees. If these guidelines are updated and a volunteer does not wish to agree to the changes, their participation in the event-related volunteer position will cease until such time as they do agree.

CONTACT US

If you require any assistance reviewing proposals or have questions about the review process or any of the best practices we have suggested, please contact us for assistance.
