KubeCon + CloudNativeCon North America

Submission Reviewer Guidelines

OVERVIEW

Thank you in advance for your efforts as a submission reviewer for KubeCon + CloudNativeCon North America.

Below you will find guides to Sessionize’s evaluation modes in addition to best practices to use when reviewing your set of proposals. Please bookmark this page for easy reference. If you have any questions, please email the CNCF Content Team.

By participating as a reviewer, you agree to the CNCF Program Committee and Events Volunteer Guidelines.

IMPORTANT DATES TO REMEMBER

  • Program Committee Review Period: June 10 – 21
  • Track Chair Recommendations Period: June 24 – July 7
  • Co-Chair Selection Period + Schedule Building: July 8 – August 4
  • Schedule Announced: Wednesday, August 14
  • Event Dates: November 12-15, 2024

Program Committee Responsibilities

CNCF ambassadors and maintainers across various projects are invited to apply for the Program Committee. To ensure diversity on the committee and mitigate bias, PC members are carefully selected by the program co-chairs, taking into account factors such as their specific interests and domain expertise. Members are also drawn from a diverse range of backgrounds, considering gender diversity, geodiversity, company diversity, and project diversity.

Program Committee members carefully evaluate 75-125 proposals in their assigned topic for each event. They are limited to participating in a maximum of two events per year.

  • Program committee members will receive an invitation to Sessionize as an evaluator the week of June 3. Please accept this invitation as soon as possible.
    Important: you must accept this invitation by Sunday, June 9 at 11:59pm PDT otherwise we cannot add you to the evaluation plan and you will not be able to participate.
  • The evaluation plan for your track will be activated in Sessionize by end of day June 10.
  • Program Committee members will use Star Rating Evaluation Mode to complete reviews by the deadline.
  • Upon completion, your scores will be combined with those of your fellow program committee members in your track. The highest-scored 30% of submissions will move on to the track chairs for an additional evaluation.

Program Committee Benefits

To say thank you for your hard work, we are offering reviewers a tiered registration discount system:

  • 50% off Attendee Registration when reviewing 50-100 proposals
  • 100% off Attendee Registration, plus a $100 gift card redeemable at the online CNCF Cloud Native Store, when reviewing 101-200 proposals.
  • Each member also receives a Program Committee Member digital badge from Credly.com to display on social channels.

Registration codes will be sent within a week after schedule announcement and your CNCF Store Gift Card will be sent separately closer to the event. Please reach out to speakers@cncf.io if you have any questions regarding the benefits listed above.

Track Chair Responsibilities

For each track, two Track Chairs are chosen from the applicants for the Program Committee, with a preference for subject matter experts and those who have previous PC experience.

Track Chairs are not part of the program committee, enabling a fresh and unbiased perspective during the evaluation process. They are limited to participating as a track chair in one event per year.

  • Track Chairs will be provided a spreadsheet with the program committee’s scores and comments for their track, sorted from highest-scored to lowest-scored.
  • They will work together to approve roughly 30% of the sessions on their spreadsheet to move forward for the co-chair review stage.
  • As experts in their track, Track Chairs must leave a comment on each session they approve to assist the co-chairs with their final selections.
  • It is important that Track Chairs work together to complete their recommendations by the deadline as there is no room for an extension.

Track Chair Benefits

To say thank you for your hard work, we are offering track chairs:

  • 100% off Attendee Registration and $200 CNCF Online Store Gift Card redeemable at the online Cloud Native Store website.
  • Each track chair will also receive a Track Chair digital badge from Credly.com to display on social channels.

Registration codes will be sent within a week after schedule announcement and your CNCF Store Gift Card will be sent separately closer to the event. Please reach out to speakers@cncf.io if you have any questions regarding the benefits listed above.

REVIEW PROCESS + BEST PRACTICES

  • Time Commitment: Please plan on committing up to 8 hours total to review all of the submissions in your track, depending on the number you have been assigned. Aim to do 10-15 sessions at a time, then take a break and walk away. This helps prevent burnout and allows you to see more proposals with fresh eyes.
  • Process Integrity: It is very important to protect the integrity of the review process, and to avoid undue bias, by keeping the submissions and your comments on them confidential. Please review and adhere to our Code of Conduct.
  • Public & Author Interaction: To ensure an unbiased review process, program committee members should not discuss submissions with authors and/or the overall public (i.e., please no tweeting). Of course, please feel free to tweet about accepted sessions that you are excited to attend once the schedule has been published.
  • Conflict of Interest: Reviewers are asked to wear their “KubeCon + CloudNativeCon” hats rather than their company or other affiliation when scoring submissions, so that all submissions are rated fairly. If a submission was written by a colleague you work closely with, or by someone you may be seen as associated with or in competition with, please skip that submission.
  • Topic Re-Routing: If you believe a talk would be better suited in a different topic, please leave a comment so we can move it to the correct track.
  • Experience Level: Use your expert knowledge to assess the experience level the audience needs in order to understand the presentation. If you feel the presentation does not match the experience level the speaker indicated, please leave a comment suggesting a better fit.
  • Speakers with multiple submissions: Speakers are allowed to be listed on a maximum of one session and one panel. If you are in the position of reviewing more proposals from the same speaker than allowed, please indicate this in the comment box and continue scoring.
  • Panel Discussions: The ideal panel is composed of diverse thought leaders who talk about 80% of the time with 20% audience interaction. Some things to keep in mind when reviewing a panel submission:
    • Is the panel diverse? Is there a mix of genders on the panel? Note for all KubeCon + CloudNativeCon Events: all panels are required to have at least one speaker who does not identify as a man.
    • Is the submission cohesive and does it provide a clear view of how the panel would progress for 35 minutes? Could they cover everything within the proposal in the given 35 minutes?
    • Have they included any sample questions?
    • Does the panel include panelists from different organizations, including the moderator?
    • Research the panelists and moderator, if needed. Is their experience relevant to the topic?
    • Will the panelists provide diverse perspectives or will they repeat the same thing four times?
    • Are there any high-profile panelists?
    • If one or two of the panelists were unable to attend, how would it impact the panel?
  • Breakout Sessions: A presentation is delivered by a topic expert with a fresh or unique point of view. Some things to keep in mind when reviewing presentation proposals:
    • Is the submission well written?
    • Is the topic relevant and original, and are the speakers considered subject matter experts?
    • Are they talking about a specific product from their company? If so, is it engaging in a way that is not advertorial? Keep in mind that sessions that come across as a pitch or infomercial for their company are most often rated very poorly among the audience.
    • Who is their target audience? Do the abstract and description match the expertise required?

STAR RATING EVALUATION MODE

[Screenshot: Star Rating mode. The progress bar updates in real time, letting you know how far along in the evaluation process you are.]

The Star rating evaluation mode is highly user-friendly, requiring only an examination of the session information and a rating of one to five stars. Half-star ratings (0.5, 1.5, 2.5, 3.5, 4.5) are also available. Upon completion, simply click the Save and continue button to confirm your rating and proceed to the next session.

[Screenshot: Star Rating overview]

KubeCon + CloudNativeCon utilizes a star rating evaluation plan with multiple criteria. As an evaluator, you will assess sessions on the following four criteria, rather than assigning a single overall rating:

  • Content: The relevance and coherence of the session’s content, the quality of the proposal, and the likelihood of effective delivery by the speaker.
  • Originality: The degree to which the session presents new and innovative ideas or approaches, as well as the originality of its delivery.
  • Relevance: The extent to which the session’s content provides new and exciting insights or information that is relevant to the conference.
  • Speaker(s): The suitability of the proposed speaker(s) based on their expertise and alignment with the subject matter.

In addition, it is mandatory to provide feedback in the form of a comment for each session. It is important to ensure that feedback is constructive, especially for rejected proposals, as submitting authors may range from a VP at a large company to a university student. Constructive feedback may include highlighting the positive aspects of a proposal, offering helpful suggestions, and providing factual feedback.

It is crucial to avoid direct attacks and instead focus on objective feedback that can help improve the proposal. Moreover, we strongly advise against using vague comments like “Scoring was tough, I had to cut this” or “LGTM.” Instead, it is essential to provide thoughtful and insightful comments that will assist the co-chairs in making their final selections.

Once you have finished rating a session based on the criteria and provided constructive feedback, click on Save and continue to proceed to the next session. Please note that if your comments are deemed unconstructive, you may not be invited to serve as a program committee member in the future.

[Screenshot: how to save and continue later]

If you have a hard time deciding on a certain session, you can skip it and come back to it later. Simply click the arrow next to the Save and continue button to expand the menu and select the Skip and ask later option. This can be particularly useful early in the evaluation process, when you want to get a better sense of the overall quality of the submitted sessions.

In case a certain session covers a subject you’re completely unfamiliar with or poses a conflict of interest, click on the Ignore this session button. The evaluation system won’t ask you about that session anymore.

Track your progress

[Screenshot: the evaluation progress bar]

During the evaluation process, a progress bar will be displayed at the top of the page, providing an indication of your progress. If, at any point, you need to pause the evaluation process, click on the Stop and continue later button located above the progress bar. Upon returning, you can resume the evaluation from where you left off.

On the right is a box with several useful tabs. The default tab is Recent. You can use it to keep track of your past session evaluations, but it has an additional purpose: you can click on any of the sessions to reopen them and potentially change your evaluation.

Here’s a complete overview of the tabs found in the aforementioned box:

  • Recent – a list of sessions you recently evaluated
  • Speaker – see other sessions submitted by the same speaker (assuming they exist)
  • Similar – browse similar sessions
  • Search – look through all nominated sessions
[Screenshot: the four tab options. Keep track of your past evaluations and re-evaluate any you aren’t happy with.]

Complete the evaluation and view your stats

[Screenshot: star rating dashboard stats. You can change your evaluation even after completing the initial process.]

Once you’re done with the evaluation, you’ll automatically be redirected to the Evaluation page. By opening the evaluation plan you’ve just completed, you can view your statistics, as well as potentially change your mind on any of the sessions by clicking on the corresponding edit button.

Your scores will be combined with those of the other members in your review track, and the top 30% of submissions will move to the next stage of evaluation.
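Sessionize's exact aggregation formula is not documented here; as a rough illustration only, assuming a simple mean across the four criteria and a straight top-30% cut, the combination step might look like this (all function names and scores are hypothetical):

```python
# Hypothetical sketch of score aggregation -- NOT Sessionize's actual formula.
import statistics

def combined_score(ratings):
    """Average one reviewer's four criterion ratings
    (Content, Originality, Relevance, Speakers) into a single score."""
    return statistics.mean(ratings)

def top_30_percent(scores):
    """Return the highest-scored ~30% of submissions as (id, score) pairs."""
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    cutoff = max(1, round(len(ranked) * 0.30))
    return ranked[:cutoff]

# Example: three submissions, each rated on the four criteria.
scores = {
    "talk-a": combined_score([4.5, 4.0, 5.0, 4.5]),  # 4.5
    "talk-b": combined_score([3.0, 2.5, 3.5, 3.0]),  # 3.0
    "talk-c": combined_score([5.0, 4.5, 4.5, 5.0]),  # 4.75
}
print(top_30_percent(scores))  # [('talk-c', 4.75)]
```

In practice, multiple reviewers' scores per submission would be averaged before ranking; the sketch shows only the ranking-and-cutoff step.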

CNCF Program Committee + Event Volunteer Guidelines

Volunteers who help with the planning, organization, or production of a CNCF event are often seen as representatives of the CNCF community or CNCF project that the event relates to, and their actions can meaningfully impact participants’ experience and perception of the event. Therefore, and in the interest of fostering an open, positive, and welcoming environment for our community, it’s important that all event volunteers hold themselves to a high standard of professional conduct as described below.

These guidelines apply to a volunteer’s conduct and statements that relate to or could have an impact on any CNCF event that the volunteer helps plan, organize, select speakers for, or otherwise serve as a volunteer for. These guidelines apply to relevant conduct occurring before, during, and after the event, both within community spaces and outside such spaces (including statements from personal social media accounts), and to both virtual and physical events. In addition to these guidelines, event volunteers must also comply with The Linux Foundation Event Code of Conduct and the CNCF Code of Conduct.

Be professional and courteous

Event volunteers will:

  • Conduct themselves in a professional manner suitable for a workplace environment;
  • Treat other event participants (including speakers, sponsors, exhibitors, attendees, volunteers, and staff) with courtesy and kindness; and
  • In their event-related communications, express their opinions in a courteous and respectful manner, even when disagreeing with others, and refrain from using obscenities, insults, rude or derisive language, excessive profanity, or other unprofessional language, images, or content.

Express feedback constructively, not destructively

The manner in which event volunteers communicate can have a large impact personally and professionally on others in the community. Event volunteers should strive to provide feedback or criticism relating to the event or any person or organization’s participation in the event in a constructive manner that supports others in learning, growing, and improving (e.g., offering suggestions for improvement). Event volunteers should avoid providing feedback in a destructive or demeaning manner (e.g., insulting or publicly shaming someone for their mistakes).

Be considerate when choosing communication channels

Event volunteers should be considerate in choosing channels for communicating feedback. Positive or neutral feedback may be communicated in any channel or medium. In contrast, criticism about any individual event participant, staff member, or volunteer should be communicated in one or more private channels (rather than publicly) to avoid causing unnecessary embarrassment. Criticism about an event that is not about specific individuals may be expressed privately or publicly, so long as it is expressed in a respectful, considerate, and professional manner.

Changes to These Guidelines and Consequences for Noncompliance 

The event organizers may update these guidelines from time to time, and will notify volunteers by email and via the CNCF Slack channels designated for event volunteers. However, any changes to these guidelines will not apply retroactively. If the Linux Foundation Events team determines that a volunteer has violated these guidelines or The Linux Foundation Event Code of Conduct, it may result in the volunteer’s immediate suspension or removal from any event-related volunteer positions they hold, including participation in event-related committees. If these guidelines are updated and a volunteer does not wish to agree, their participation in the event-related volunteer position will cease until such time as they do agree.


CONTACT US

If you require any assistance reviewing proposals or have questions about the review process or any of the best practices we have suggested, please contact us for assistance.
