Notes
1 Front Matter
1.1 Open Licensing and Adaptation Guidance
Organizations such as state, consortial, or institutional affordable learning and OER programs are encouraged to adapt this SOP to fit their own processes and contexts. This document is openly licensed under a Creative Commons Attribution 4.0 International license.
Affordable Learning Georgia (ALG) encourages organizations to collaborate on the creation of a more universally applicable SOP template for OER program management. This SOP may assist in getting a template started, but ALG’s context within a state system may diminish the SOP’s compatibility with all varieties of affordable learning and OER programs.
SOP chapters will follow a hierarchical outline structure from the highest-level major process to the lowest-level procedures. Each chapter will focus on a major ALG process; chapter length will vary. This structure was largely selected for the document to be compatible with as many systems as possible and customizable by other entities for their own use. A hierarchical text structure also ensures accessibility: screen readers will have the proper text and headings to convey all information.
1.2 Authorship and Contact Information
This SOP was authored in Markdown using Obsidian. ALG chose Markdown for its simplicity and flexibility, along with its compatibility with Manifold, the platform behind OpenALG. No generative AI was used in the authorship of this document; if this changes, we will disclose any generative AI use.
Please contact us at [email protected] with any questions, errata, edits, or possible additions. For more information about ALG, our website is located at https://affordablelearninggeorgia.org.
2 Context and Purpose of the SOP
2.1 Standards and Vocabulary
For this Standard Operating Procedures (SOP) document, ALG will follow the Terms and Definitions section of the quality management systems standard ISO 9000:2015 as the authority on naming and structuring fundamental process and organizational terms. ISO 9000:2015 will soon be replaced by an updated version, ISO/DIS 9000, which was in the Under Development stage at the time of this publication. A new edition of this SOP addressing ISO/DIS 9000 will be created if changes within the standard affect this document.
A controlled vocabulary allows ALG to structure its work clearly and convey this structure to similar affordable learning programs. In defining our vocabulary using ISO 9000:2015, ALG can clarify the systems and processes surrounding the organization and within it.
2.1.1 Process
A process is a set of interrelated or interacting activities that use inputs to deliver an intended result.
ISO standards do not specify a hierarchy of processes. For the purposes of this SOP:
- A major process is a large-scale recurring process central to achieving a result.
- Example: Grant selection is one major process within ALG.
- A process is a set of procedures central to completing and sustaining a major process.
- Example: Processes within grant selection include launching a round of grants, selecting peer reviewers, reviewing applications, and awarding selected projects.
2.1.2 Procedure
A procedure is a particular way to carry out a process or part of a process.
In the SOP, unlike a major process or a process, procedures will not be labeled as "Procedure: (title of procedure)" as they are too numerous to label. Procedures are instead hierarchically listed under their processes.
2.1.3 Project
A project is a unique process consisting of a set of coordinated and controlled activities with start and finish dates undertaken to achieve an objective. A project conforms to specific requirements including time constraints, cost, and resources.
2.1.4 Objective
An objective is a result to be achieved.
ISO standards do not specify a hierarchy of objectives. For the purposes of this SOP:
- A major objective is a (or the) main result of the organization to be achieved, such as (or contained within) a mission statement.
- Example: ALG’s mission statement is “Affordable Learning Georgia promotes student success and levels the playing field for all students through supporting the adoption, adaptation, and creation of affordable and open educational resources by University System of Georgia (USG) faculty and professional staff.” In this statement, the major objective is defined alongside the major processes necessary to reach the objective.
- A sub-objective is an objective serving an objective higher in a hierarchical list of objectives.
2.1.5 Organization
An organization is a group of people that has its own functions with responsibilities, authorities, and relationships to achieve its objectives.
2.1.6 Strategy
A strategy is a plan to achieve a long-term or overall objective.
Strategic objectives are objectives that, when achieved, contribute to a strategy.
A strategic plan is a long-term plan to achieve multiple strategies for an organization.
2.1.7 Policy
Policy is the intentions and direction of an organization as expressed by its top management (e.g. Executive Director, Vice Chancellor, Chancellor).
2.1.8 Management
Management is the set of coordinated activities to direct and control an organization. Management can include establishing policies and objectives, as well as processes to achieve these objectives.
2.1.9 Top Management and Top Executive
Top Management or Top Executives direct and control an organization at the highest level.
2.1.10 Stakeholder
A stakeholder is a person or organization that can affect, be affected by, or perceive itself to be affected by a decision or activity. Stakeholders could include anyone inside or outside of an organization.
2.1.11 Quality
Quality is the degree to which a set of inherent characteristics of an object fulfills its requirements.
2.1.11.1 Note on Quality
ISO 9000:2015 specifically uses the phrase "inherent characteristics" in its definition of quality: pertaining only to essential, permanent characteristics within the basic nature of the object itself. The object, whether an organization, a service, a textbook, or even a single photo, retains its inherent characteristics no matter who uses and perceives it.
In this ISO definition, "requirements" are the variable element: they are where perception shapes the specific meaning of quality per instance.
2.1.12 Requirement
A requirement is a need or expectation that is stated, generally implied or obligatory.
2.1.12.1 Note on Requirements as the Determinant of Quality
ISO 9000:2015 has six notes under the definition of requirement; the only other terms with six notes, the maximum number of notes in their Terms and Definitions, are the equally important terms "process" and "risk." Requirements can be implied through a field's common practice or obligatory through laws, rules, regulations, and policy.
In the education world, particularly in a system with multiple types of institutions with various interrelated missions, the word "quality" can often be applied to resources or the process of instruction in a subjective, case-by-case way. Quality Matters (QM) is a good example of an organization addressing instructional quality. The official glossary of Quality Matters does not have a definition for the word "quality." However, QM does not need to define quality in their glossary; QM defines quality in specific ways through setting requirements in various contexts. QM provides detailed rubrics for meeting the needs of instructors and institutions who require excellence in online instruction; this is, in essence, quality assurance work.
2.1.12.2 Quality and Requirements in OER
Research in the field of open education has identified a barrier in faculty perception due to mixed concerns about resource quality. Open and affordable education program leaders must respond to, measure, and provide solutions for resource quality issues with clarity, focusing on the two sides of the ISO quality definition: the interaction between the instructional requirements of faculty and the inherent (but changeable through open licensing) characteristics of a resource:
- When raising awareness of the existence of OER as "high-quality," you are addressing the likelihood that the inherent characteristics of materials will meet faculty needs, not giving a promise that every resource will objectively meet the needs of every instructor.
- When assessing faculty evaluations of quality in educational resources, gather faculty requirements and assess whether those specific requirements are being met. These requirements will differ by institutional setting, institutional mission, discipline, course, and instructor. Simply asking about the "quality" of a resource without context in qualitative or quantitative research on OER will yield extremely limited actionable findings.
2.1.13 Interchangeable Terms
Various state government agencies, divisions, and departments use different terms for a hierarchy of organizations and a hierarchy of objectives. To clarify this for the reader, this document will adhere to ISO terminology where possible, but may use similar hierarchical terms out of necessity:
2.1.13.1 Organizations, Initiatives, and Programs
ISO 9000:2015 is built for management systems within all types of organizations, from large corporations to small nonprofits. Not all ISO language will apply to ALG’s work; ISO 9000:2015 does not have a formal definition of “program” and ALG is classified as a program within GALILEO. GALILEO is classified as an initiative of its higher-level organization, the USG. The terms “initiative” and “program” will sometimes be used interchangeably with the term “organization” when defining the organizational hierarchy in these documents.
2.1.13.2 Objectives, Goals, and Strategies
In official documents for ALG and higher-level organizations, the aspects of an organization that the ISO standard defines as “strategies” and “objectives” might vary: “goals,” “strategic goals,” and “tactical goals” are sometimes used interchangeably and/or in an arbitrary hierarchy in strategic plans. The 2029 USG Strategic Plan is organized in a hierarchy of “goals” as major objectives and “initiatives” as processes to achieve the objectives. The 2013-2018 USG Strategic Plan, in contrast, was organized in a hierarchy of “strategic imperatives” as major objectives and “actions” as processes to achieve the objectives.
2.2 Organizational Context
2.2.1 Affordable Learning Georgia within the USG
Affordable Learning Georgia is a program of GALILEO, Georgia’s Virtual Library, an initiative of the University System of Georgia (USG) and part of the Library Services department in the Academic Affairs division of the USG.
ALG is structured in a vertical hierarchy within five organizations and aligned with multiple layers of organizational policy:
- Government: State of Georgia Government: State of Georgia Rules and Regulations
- Top Executive: Governor
- System: University System of Georgia: USG Board of Regents Policy Manual
- Top Executives: Board of Regents and Chancellor
- Division: Division of Academic Affairs: Academic and Student Affairs Handbook
- Top Executive: Executive Vice Chancellor for Academic Affairs
- Department: Department of Library Services (Different policies per GPLS, Archives, or GALILEO)
- Top Executive: (Associate/Assistant) Vice Chancellor for Library Services
- GALILEO: GALILEO Vision and Mission, GALILEO Policies
- Top Executive: Executive Director, GALILEO
- Program: Affordable Learning Georgia: Mission and Values
- Top Management: Program Director, Affordable Learning Georgia
The USG is a governing and management authority for all public higher education institutions within the state of Georgia. The USG manages a system of 25 institutions within four classifications: Research Universities, Comprehensive Universities, State Universities, and State Colleges. The USG also includes all statewide library programs: GALILEO, the Georgia Public Library Service, and the Georgia Archives. Learn more about the structure and history of the organization on the USG website’s USG Facts page.
As a GALILEO program within the USG, ALG’s highest-level objective is to promote student success throughout our institutions through equal access to required course materials. ALG’s highest-level processes to serve this objective support the adoption, adaptation, and creation of affordable and open educational resources by USG faculty and professional staff. Data is then gathered and analyzed to determine whether ALG meets this major objective. The reports from this analysis are shared with the public, and customized versions of these reports are sent to the Vice President for Academic Affairs (VPAA, often the Provost) within each institution. This ensures that both the governing organization and all institutions are informed about ALG’s role, and each institution’s role, in keeping materials costs affordable, thereby assisting with equal access to materials and contributing to the success of more students at each institution.
An optimal organizational objective aligns through all five organizations in the hierarchy listed above: ALG’s objectives help GALILEO accomplish a portion of its objectives, while GALILEO’s objectives help the USG Department of Library Services and the USG Division of Academic Affairs accomplish a portion of their objectives, serving the University System of Georgia’s Strategic Plan; the USG in turn helps accomplish educational and career objectives set by the State of Georgia Government.
2.2.2 Management Processes External to ALG
ALG benefits from critical management processes within other areas of the larger USG organization. Because these processes take place outside of ALG, they will not be described in detail within our SOP documentation, but they may be mentioned as they interact with internal processes. These processes include:
- State of Georgia: state funding, state-level policies, rules and regulations, laws
- USG Board of Regents and Chancellor: system-level policies, system-level funding to all institutions and the system office, top-level approvals
- USG Division of Academic Affairs and Department of Library Services (GALILEO): academic strategy, faculty affairs, web services, library resource management, contract management, performance management, executive oversight, interdepartmental and executive outreach
- USG Division of Operations: human resources, legal affairs and copyright, information technology, cybersecurity, research and policy analysis (institutional data), safety and security
- USG Division of Fiscal Affairs: accounting, budgeting, risk management, finance, procurement
- USG Division of External Affairs: systemwide and statewide communications, governmental relations
- USG Division of Internal Audit, Ethics, & Compliance: resolution of ethical issues within USG institutions and the system office
Without the assistance of the state and USG departments with these processes, ALG would need far more staff with various specializations to function. This division of labor also determines which management processes reside within ALG, and therefore which processes are listed in this SOP and which are not.
2.2.3 The Working Manager Model
ALG strives to balance a fiscally responsible, small, and agile team while encouraging taking initiative and growing through professional development. The ALG Program Director and Program Manager (“ALG Staff”) serve functionally as working managers: managers who direct organizations while also performing operational tasks aimed at achieving the organization’s objectives. In one workday, ALG staff may be leading national workgroups or presenting on panels with other experts in the field, followed by repetitive spreadsheet work, then managing and entering metadata, then fixing a bug on the ALG website, and then reformatting and sending multiple documents for signatures, all interspersed with timely one-to-one email responses to stakeholders across the system office and its institutions.
While these operational tasks are numerous, ALG staff are assisted directly by leaders and staff within GALILEO on specific functions. When outreach stretches beyond the immediate ALG community of ALG Champions, grants offices, and grantees, ALG works with the Assistant Director of Marketing and Communications of GALILEO on processes such as system-wide marketing, events, and in some rare cases, editing and sending a press release to USG Communications for approval and distribution. When resolving an issue that stretches beyond the boundaries of GALILEO, the Executive Director of GALILEO works with ALG to liaise with other departments and other executive leaders. The Executive Director of GALILEO also assists ALG in setting strategic direction, ensuring processes and objectives are aligned with GALILEO and that GALILEO can assist ALG effectively in achieving any new objectives. As mentioned in the list of management processes external to ALG, other levels of the overall organization manage many day-to-day organizational operations for the entire university system including ALG and GALILEO.
2.2.3.1 Advantages and Disadvantages
The working manager model ultimately allows ALG to respond quickly and effectively to the needs of GALILEO and the USG as they arise. An ALG staff member requires levels of knowledge ranging from operational to strategic when answering complex questions from stakeholders with a level of timeliness expected and needed within this role. Collaborating and working directly on all operational tasks helps ALG staff visualize any issues encountered by stakeholders and conceptualize novel solutions when more routine solutions cannot suffice, and in a newer field such as affordable and open education, long-established models do not yet exist and routine solutions frequently do not suffice. Effective and expedient customer and/or stakeholder service in a growing and changing field is therefore a standout strength of a functioning working manager model and a strength of ALG.
The working manager model also has a standout vulnerability: intermittent high demands for operational tasks have the potential to stifle any professional growth or organizational innovation if the workload increases anywhere beyond its current level. Too many repetitive operational demands may bury a managerial role in tasks a traditional “pure” or non-working manager would not be assumed to perform. Decisions on adding or removing positions from ALG or a restructuring of this organization should take this difficult balance between operational tasks and management into account.
2.2.4 ALG Stakeholders
2.2.4.1 Major Stakeholders
Major stakeholders are supported by ALG programs through USG funding. Major stakeholders include:
- State of Georgia: Georgia benefits from the University System of Georgia primarily through the academic achievement and subsequent employment of USG students.
- USG Students: the direct recipients of the results of ALG grant projects and published OER. ALG as a program may work primarily with faculty, staff, and leadership, but all ALG processes serve the major objective of USG student success.
- USG Executive Leadership: leaders who are working toward current objectives for the USG and set priorities for institutions and major departments.
- USG Faculty: instructors and academic professionals who could adopt, adapt, or create materials through an ALG-supported project or through other means.
- USG Staff: departments and professionals involved in providing affordable resources and creating effective teaching and learning materials.
Within these major stakeholder groups, ALG Champions are volunteers who offer their expertise, assistance, and advocacy for affordable and open materials at each institution. ALG Champions meet monthly with ALG staff and coordinate on-campus communications, grants assistance, and workshops in collaboration with ALG.
2.2.4.2 External Stakeholders
External stakeholders are individuals and organizations who are affected by ALG programs. External stakeholders include:
- Non-USG Faculty: instructors and academic professionals outside of the USG who could adopt or adapt open materials published through an ALG-supported project.
- Non-USG Students: students outside of the USG affected by the implementation of materials published through an ALG-supported project.
- Peer Organizations and Conferences: state, consortial, and provincial organizations with similar or connected objectives and processes who could be affected by materials published by ALG, meetings with ALG staff, collaborations with ALG staff, and presentations by ALG staff.
- Examples include Affordable Learning Solutions (California State University), Affordable Learning Louisiana (LOUIS), OERizona (statewide collaborative), USG Faculty Development, the USG Teaching and Learning Conference (USGTL), and the Georgia Library Conference (GLC).
- Peer Individuals: individuals working toward similar objectives with similar processes who could be affected by materials published by ALG, meetings with ALG staff, collaborations with ALG staff, and presentations by ALG staff.
- Examples include contributors to the Open Education Forum listserv, members of the Open Education Association, and similar individuals in roles within peer organizations.
- Field Organizations and Conferences: system, state, consortial, regional, and national organizations in the field of affordable and open education who could be affected by ALG participation and/or leadership.
- Examples include the Open Education Association (OEA), Driving OER Sustainability for Student Success (DOERS), Open Education Global (OEGlobal), and the Open Education Conference (OpenEd).
2.3 Purpose of the SOP
2.3.1 Major Objective
The major objective of the creation, maintenance, and publication of the SOP is to create a full map of the processes and procedures of ALG, contributing to process improvement, agility and adaptability, increased accessibility, and a shared understanding of the complex and necessary work Affordable Learning Georgia does in service of organizational objectives.
This major objective has multiple sub-objectives.
2.3.2 Sub-Objectives
2.3.2.1 Guiding Process Improvement and Change Management
A full map of the procedures involved with a major process allows ALG to analyze the process at all levels and identify areas for improvement. Changes have already happened as a result of authoring the SOP: for example, in mapping out the preparation procedures taking place on the morning after a grant deadline, ALG was able to see where to separate peer review procedures, which were urgent, from administrative review procedures, which had more lead time. This allowed ALG to send out applications for peer review more quickly while maintaining the same level of precision. The administrative review-exclusive tasks no longer needed to be completed simultaneously; once peer reviews were sent, reviewers had what they needed, and administrative review preparations could follow soon after.
Upon mapping procedures step-by-step, ALG was surprised by how salient the need for an organizational context analysis was. An SOP directly answers the question "How do we do this?" However, the question "Why do we do this?" was innately embedded in every procedure's description as ALG mapped out each major process. Questions arose beyond "why" and beyond the organization as well: knowing what ALG is, what it does, what it has the potential to do, and what was in the domain of other departments and programs was crucial to mapping responsibilities with any collaborators outside of ALG staff. Before authoring the SOP, organizational context seemed obvious and self-evident, but the authoring process illuminated how complex the underlying and overarching structures supporting ALG were, bringing with it an urgent need to define those structures clearly.
This enhanced structural knowledge resulted in a deeper understanding of ALG's work, hopefully empowering ALG management to be both more informed and more resilient in the face of future changes, conflicts, and crises. In a quickly-changing world, ALG must be able to adapt at a moment's notice alongside its larger organizations. Through the authorship and maintenance of an SOP document, a comprehensive understanding of both the top-down organizational context of our objectives and the bottom-up procedures and processes aimed at achieving these objectives will allow ALG staff to navigate changes within and outside of the field of open and affordable education as agile and effective leaders. Changes in funding or in scale at ALG, including any potential opportunities to grow the organization's impact and add new objectives, resources, and processes, will require both a high level of understanding and a comprehensive map of the organization's procedures in order to make informed decisions as these changes arise.
2.3.2.2 Increasing Accessibility
A full map of procedures allows ALG to analyze its work to increase accessibility in two ways: the accessibility of the resources we publish and the accessibility of work procedures.
Affordable Learning Georgia has pursued accessibility for all grant teams' published open educational resources in the past, but the federal requirement for all educational resources to comply with ADA Title II accessibility standards prior to individual accommodations, effective April 2026, has driven ALG to formalize both its accessibility processes and its documentation of these processes.
The creation process for this SOP has already led to accessibility changes in our grant review processes, eliminating procedures where color was used to convey information without an alternative. Further, a full step-by-step listing of procedures assists ALG staff in performing annual and bi-annual processes. For instance, preparing a round of grant applications for peer review no longer requires all the listed steps to be recalled from memory on the morning after an application deadline, decreasing the cognitive load required of staff with working memory-related disabilities.
2.3.2.3 Making Complex Work Understandable
Affordable Learning Georgia was one of the first programs of its kind to support OER and affordable learning at a systemwide level. No software or platforms exist that are customized specifically for affordable learning program processes such as grant programs or cost savings data management. These processes instead take place in common, flexible programs such as word processors, spreadsheets, and online survey forms. ALG must customize these common programs to achieve our objectives, which involves knowledge in various domains including data management, copyright, library and information science, and accessibility remediation.
ALG strives to keep all processes required of grantees, executive leaders, and volunteer ALG Champions as simple to perform as possible. Without specialized platforms to facilitate this, the burden of complexity falls largely on ALG staff. For example, ALG cannot simply send raw data spreadsheets to stakeholders and expect them to be aware of affordability on their campus as a result: data needs to be cleaned, hosted, connected through a model, routed through a designed dashboard, and then exported as a report. That report, sent to executive leaders, is a simple PDF document, but the work leading to that simple document involves a high level of complexity for ALG.
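As a hedged illustration of the first step in this pipeline (cleaning raw data before it reaches a model or dashboard), the sketch below totals per-institution cost savings from a raw export using only the Python standard library. The column names (`institution`, `savings`) are hypothetical placeholders, not ALG's actual schema.

```python
import csv
from collections import defaultdict
from io import StringIO

def summarize_savings(raw_csv: str) -> dict:
    """Clean a raw cost-savings export and total savings per institution.

    Column names ('institution', 'savings') are hypothetical stand-ins
    for whatever fields a real export contains.
    """
    totals = defaultdict(float)
    for row in csv.DictReader(StringIO(raw_csv)):
        institution = row["institution"].strip()
        try:
            savings = float(row["savings"])
        except (TypeError, ValueError):
            continue  # skip unparseable rows rather than fail the whole report
        totals[institution] += savings
    return dict(totals)
```

A summary like this would then feed a dashboard or report template rather than being sent to stakeholders as raw rows.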
Whenever complex and difficult work is critical to reducing complexity for our stakeholders, ALG pursues more efficient and effective ways of accomplishing the work and meeting our objectives, but until tools and technology mature further, some of this work will continue to have a high level of difficulty and complexity. This complexity is not easily delegated, and training a new ALG staff member on this work requires extensive, careful onboarding.
In a complex work environment, a Standard Operating Procedures document allows for:
- ALG staff to refer back to a source and understand their work;
- ALG management to use as a primary source for the creation of staff training and onboarding materials;
- direct supervisors to understand the work involved in an ALG staff position;
- executive leaders to understand the need for funding staff time and providing resources; and
- any member of the public to understand the organization's due diligence, including checking for accessibility and tracking impact through data management and analysis.
2.3.3 Note: Onboarding and Training
A Standard Operating Procedures document is not a substitute for proper onboarding and training procedures at ALG. Onboarding and training are thoughtfully-planned activities involving not only procedures and processes, but also the work environment, getting to know peers, and getting to know the culture and history of the larger organization. The SOP can be used to help create training and onboarding materials, but the SOP, in itself, is not intended to be a training document.
Further, the SOP should not be seen as a training document for new open and affordable education leaders outside of ALG.
3 Major Process: Grant Selection
3.1 Process: Launching a Grant Round
3.1.1 Creating a Timeline
3.1.1.1 Set the Application Lead Time.
- Applicants should have at least three months of lead time from the announcement of the Request for Proposals to the application deadline.
3.1.1.2 Set the Reviewer Recruitment date.
- This is the date when you will send an email to all ALG-interested faculty with a request for reviewers, along with the completed Call for Reviewers form.
- Set this date at least a month before the deadline for applications.
3.1.1.3 Set the Reviewer Recruitment deadline and Notification Date.
- This is when you will notify reviewers that they have been selected.
- Set the deadline about two weeks after the recruitment date.
3.1.1.4 Set the Review timeframe.
- To expedite the review process, peer reviews should start directly after the deadline.
- Peer and administrative reviews should take about two weeks (13 or 14 days, depending on where weekends fall).
- Expect to be available to answer technical questions during the peer review process.
3.1.1.5 Set the Notification Date.
- This is when all awarded Project Leads from proposals are notified. Allow most of this day for writing emails and answering awardee questions.
- Allow enough time in your schedule to make an official webpage for all grantees and an official announcement in the Newsletter.
3.1.1.6 Set up the Online Synchronous Kickoff Meeting date.
- This day should be about two or three weeks from the Notification Date in order to allow grantees to confirm their projects and complete asynchronous training materials.
- Fridays have worked the best for grant teams in the past.
3.1.1.7 Confirm the grant timeline and reserve dates for ALG staff in Outlook Calendar.
- Create a Word document in the grant folder for the year (e.g. R26-R27, which is one fiscal year) with all timeline dates.
- Send dates to the Executive Director of GALILEO for approval.
- In the Outlook Calendar, add these dates and date ranges (for the review processes) and include any ALG staff:
- Round [#] Application Deadline
- Peer Reviews (range)
- Administrative Reviews (range)
- Online Kickoff Meeting (Teams meeting)
- Be sure to make the Kickoff Meeting date in the calendar a Teams meeting with a link to join. We will use this to invite all grant teams once reviews are complete.
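The lead times in the timeline steps above can be sketched as a small date calculator. The specific offsets below are illustrative choices within the stated guidance (for example, 13 weeks for "at least three months"), not fixed ALG policy.

```python
from datetime import date, timedelta

def grant_round_timeline(rfp_announcement: date) -> dict:
    """Derive key grant-round dates from the RFP announcement date.

    Offsets follow the minimum lead times described in the SOP; the
    exact numbers chosen here are illustrative assumptions.
    """
    application_deadline = rfp_announcement + timedelta(weeks=13)     # at least three months of lead time
    reviewer_recruitment = application_deadline - timedelta(weeks=5)  # at least a month before the deadline
    recruitment_deadline = reviewer_recruitment + timedelta(weeks=2)  # about two weeks after recruitment
    review_start = application_deadline + timedelta(days=1)           # reviews begin directly after the deadline
    review_end = review_start + timedelta(days=13)                    # about two weeks of review
    notification = review_end + timedelta(days=1)                     # notify awarded Project Leads
    kickoff = notification + timedelta(weeks=3)                       # two to three weeks after notification
    return {
        "application_deadline": application_deadline,
        "reviewer_recruitment": reviewer_recruitment,
        "recruitment_deadline": recruitment_deadline,
        "review_start": review_start,
        "review_end": review_end,
        "notification": notification,
        "kickoff": kickoff,
    }
```

Computed dates would still need manual adjustment for weekends, holidays, and the preference for Friday kickoff meetings before being confirmed and entered into the Outlook Calendar.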
3.1.2 Creating Grant Documents
With all documents, run an Accessibility Check in the appropriate application to ensure all documents are accessible to faculty and staff with disabilities, particularly with structured text and descriptive links. Make any adjustments.
3.1.2.1 Update the Request for Proposals.
- Check to ensure that all dates and round numbers are updated.
- Integrate new priorities based on strategic goals.
- Integrate new clarifications in response to any feedback from previous points of confusion during the past year.
3.1.2.2 Update the Online Application Form, Word Templates, and Rubrics.
- Update the Online Application Form with dates and any changes due to policies or updates to data collection.
- The Application Template Documents (Word) will need updated dates at the very least. Be sure to keep the template and the Online Application Form's fields and questions identical to each other.
- If any changes have been made to the rubrics for peer review, update these as well.
- Duplicate the review spreadsheet from the previous round and delete everything but the headers.
3.1.2.3 Update Kickoff Documents.
- Update the asynchronous Google Sites training for at least rounds and dates. Re-check the training for accessibility and adjust accordingly.
- Duplicate the previous round's Kickoff Meeting presentation and update the rounds, dates, and any changes to procedures. Re-check the presentation file for accessibility and adjust accordingly.
3.2 Process: Issuing a Call for Peer Reviewers
If this is the first round of the fiscal year, create a new Reviewers folder for the new rounds of grants.
- Copy the files from the previous year’s Reviewers folder.
- In the new rounds folder (e.g. “R28-R29”), paste the previous Reviewers folder.
3.2.1 Recruiting and Selecting Peer Reviewers
3.2.1.1 Set up the Call for Reviewers Spreadsheet for administrative review.
- Download the Individual Responses spreadsheet in Zoho.
- Click on “ALG Call for Reviewers.”
- Click on “Individual Responses.”
- On the left sidebar in the “Filter by” dropdown, select “Completed Responses.”
- On the top sidebar on the right side, click Export > As Spreadsheet.
- In the next export window, sort the spreadsheet by Newest First.
- Download the exported spreadsheet using the directions provided in Zoho.
- Move the spreadsheet to the current Reviewers folder for the year.
- In the spreadsheet, delete the following:
- First row
- The following columns if Zoho has them in the export:
- Index
- Response ID
- IP address
- User agent
- Response status
- Survey URL (etc.)
- Start Time
- Time Taken
- Collector
- Delete all entries pertaining to the previous year.
- The gap in dates in the “Completion Time” column will be obvious.
- Once previous year entries are deleted, delete the “Completion Time” column.
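If you prefer to script the cleanup above, a pandas sketch might look like the following. The column names are taken from the list above, and `clean_export` is a hypothetical helper, not part of ALG's tooling:

```python
import pandas as pd

# Zoho metadata columns to drop, per the list in 3.2.1.1.
DROP_COLUMNS = ["Index", "Response ID", "IP address", "User agent",
                "Response status", "Survey URL", "Start Time",
                "Time Taken", "Collector"]

def clean_export(df, cutoff):
    """Drop Zoho metadata columns, keep only entries newer than the
    previous year's cutoff date, then drop Completion Time itself."""
    df = df.drop(columns=[c for c in DROP_COLUMNS if c in df.columns])
    df = df[pd.to_datetime(df["Completion Time"]) > pd.Timestamp(cutoff)]
    return df.drop(columns=["Completion Time"]).reset_index(drop=True)
```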
3.2.1.2 Convert the resulting spreadsheet into a table.
- On the Excel ribbon, click Insert > Table. The range should automatically be selected.
- Delete any extra rows at the bottom of the table if necessary. This will help with Mail Merge functionality.
- Rename long headers.
- “Are you planning to submit, or be on the team for the submission of, an Affordable Materials Grants proposal in the next year ((example year) Round 28 or 29)?” changes to “Submitting R28/R29.”
- “Describe your experience with Open Education and/or affordable learning in detail. (Please provide at least two paragraphs for this answer if possible.)” changes to “ALG Experience.”
- “Describe your experience with evaluating applications, especially grant applications, in detail. (This response can be as short or long as necessary for a sufficient description.)” changes to “Evaluating Experience.”
- “Have you ever been an Affordable Materials Grants awardee, either as a Project Lead or a Team Member?” changes to “Previous Grantee.”
- “Are you currently one of the designated Affordable Learning Georgia Champions at your institution?” changes to “ALG Champion.”
- Add score columns.
- To the right of “ALG Experience,” add an “ALG Score 1-10” column.
- To the right of “Evaluating Experience,” add an “Evaluating Score 1-10” column.
- To the right of “Previous Grantee,” add a “Grantee +2pts” column.
- To the right of “ALG Champion,” add a “Champion +4pts” column.
- Set up If/Then formulas.
- K2 is “Previous Grantee” in this example. In the corresponding “Grantee +2pts” column, add an if-then formula to give “Yes” in “Previous Grantee” an extra 2 points.
- =IF(K2="Yes",2,0)
- M2 is “ALG Champion” in this example. In the corresponding “Champion +4pts” column, add an if-then formula to give “Yes” in “ALG Champion” an extra 4 points.
- =IF(M2="Yes",4,0)
- Copy both formulas down the entire column.
- To the right of all columns, add a “Total” column.
- Add a sum formula to add the ALG Score 1-10, Evaluating Score 1-10, Grantee +2pts score, and Champion +4pts score together in this column.
- Example: =SUM(H2, J2, L2, N2)
- To the right of the Total column, add a “Notes” column for brief notes explaining your scores.
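The spreadsheet formulas above can be mirrored in code for quick verification. `reviewer_total` below is a hypothetical helper reproducing the IF and SUM logic:

```python
def reviewer_total(alg_score, eval_score, previous_grantee, alg_champion):
    """Mirror of the scoring in 3.2.1.2: two 1-10 scores, +2 points for
    previous grantees, +4 points for ALG Champions."""
    grantee_pts = 2 if previous_grantee == "Yes" else 0   # =IF(K2="Yes",2,0)
    champion_pts = 4 if alg_champion == "Yes" else 0      # =IF(M2="Yes",4,0)
    return alg_score + eval_score + grantee_pts + champion_pts  # =SUM(H2, J2, L2, N2)
```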
3.2.1.3 Evaluate each application.
Note: If more than one administrator is evaluating, give each administrator their own copy of the spreadsheet. SharePoint does not handle live updates well with more than one editor.
- ALG Experience: Look for experience with OER, affordable materials, grant projects. These responses should be at least two paragraphs long.
- Evaluating Experience: Look for experience in reviewing grant applications or other applicable evaluation experience, such as administrative reviews for a national, nonprofit, or institutional initiative. These responses should be at least two paragraphs long.
- Enter any notes that would help you or other administrators recall or understand your scoring.
- While potential applicants for a grant round in this semester are ineligible, evaluate all applications, including those with a “Yes” in the submission-plans column. If an applicant who said “Yes” to submitting an application during this academic year is a top pick, email the applicant to double-check the answer; it’s possible they misread or answered the question incorrectly.
- If you have more than one reviewer, meet as a group to pick your top 15 for the Final Slate.
3.2.1.4 Create the Final Slate and a list of not selected applicants.
- Create a Y/N column for your selected/not selected applicants.
- Sort the table by the Y/N column.
- Create a Final Slate blank tab.
- Copy and paste the Y rows to the Final Slate tab. You’ll only need the first columns up until the Position column.
- Format the new section as a table with corresponding headers.
- Delete the Y/N column (only in the Final Slate table).
- Send your Final Slate to the Executive Director for review.
- Because this table is small, it’s often easier to copy and paste it into an email than to attach a file or link to a spreadsheet on SharePoint.
- Create a Not Selected blank tab.
- Copy and paste the N rows to the Not Selected tab. You’ll only need the first columns up to the email column.
- Be sure to put the headers in the first row.
- Format the new section as a table.
- Delete the Y/N column.
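The Final Slate / Not Selected split above amounts to partitioning rows on the Y/N column. A minimal sketch, using illustrative dict rows rather than the real spreadsheet schema:

```python
def split_slate(rows):
    """Split applicant rows into Final Slate ("Y") and Not Selected ("N")
    lists, dropping the Y/N marker from each copied row, as in 3.2.1.4."""
    final_slate, not_selected = [], []
    for original in rows:
        row = dict(original)            # copy so the source table is untouched
        marker = row.pop("Selected")    # the Y/N column
        (final_slate if marker == "Y" else not_selected).append(row)
    return final_slate, not_selected
```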
3.2.1.5 When approved, send notifications on the notification date.
- Open the “Email-Selected” copied Word document and update this document for the current rounds:
- Change the dates, times, and rounds.
- Replace the Teams invite for the online orientation meeting.
- Connect the Final Slate to the mail merge:
- On the ribbon, click “Mailings.”
- Click “Select Recipients” > “Use an existing list.”
- Find the review spreadsheet and select it.
- Pick the “Final Slate” tab from the resulting menu.
- Fields should update by themselves, but if not, delete any “Error!” fields and replace them by clicking “Mailings” > “Insert Merge Field.”
- Send the mail merge to the selected applicants:
- Be sure that Outlook is open.
- In the Word document, in the ribbon, click “Mailings” > “Finish & Merge” > “Send email messages…”
- Ensure that the “To” field is linked to the Email column in your spreadsheet.
- Update the subject line for the current rounds.
- Click “OK.”
- Open the “Email-Not Selected” copied Word document and update this document for the current rounds:
- Change the rounds and anything that has to do with the selection criteria or competitiveness of the round.
- Connect the Not Selected tab to the mail merge:
- On the ribbon, click “Mailings.”
- Click “Select Recipients” > “Use an existing list.”
- Find the review spreadsheet and select it.
- Pick the “Not Selected” tab from the resulting menu.
- Fields should update by themselves, but if not, delete any “Error!” fields and replace them by clicking “Mailings” > “Insert Merge Field.”
- Send the mail merge to the not selected applicants:
- Be sure that Outlook is open.
- In the Word document, in the ribbon, click “Mailings” > “Finish & Merge” > “Send email messages…”
- Ensure that the “To” field is linked to the Email column in your spreadsheet.
- Update the subject line.
- Click “OK.”
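Before sending either mail merge, it can help to preview what each recipient will receive. The sketch below is a stand-in for Word’s merge, not ALG’s actual template; the wording and field names are placeholders:

```python
# Placeholder template text; merge fields mirror the spreadsheet columns.
TEMPLATE = ("Dear {Name},\n\nYou have been selected as a peer reviewer "
            "for Rounds {Rounds}. Orientation: {MeetingDate}.")

def render_notifications(recipients):
    """Return (email address, message body) pairs for each recipient row,
    so merge output can be eyeballed before anything is sent."""
    return [(r["Email"], TEMPLATE.format(**r)) for r in recipients]
```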
3.2.1.6 Update the Zoho form for peer reviews with either previous data or test data to demonstrate during the onboarding meeting.
- Check for any changes to the rubric.
- Ensure Save/Continue Later is enabled in Settings.
3.2.1.7 Create the new SLAs for reviewers.
- Reformat a copy of the Excel sheet configured for institution-level SLAs.
- In the Reviewers folder, copy the “ALG Call for Reviewers (rounds).xlsx” spreadsheet and paste it into Reviewers > SLA > Unsigned.
- Open the spreadsheet.
- Select the Institution column and cut (Ctrl-X) it. Select a cell in column A, right-click, select Insert > Table Columns to the Left, then paste (Ctrl-V) the Institution column into the new column.
- Sort the table by institution.
- Concatenate the first and last names.
- Insert a column after “Last Name” titled “Name1.”
- In the first non-header cell, input the formula: =CONCAT([@[First Name]], " ", [@[Last Name]])
- Copy the formula down the column.
- Copy the column values. In a separate column outside the table, right-click, then “Paste,” then “123” (Values).
- Replace the formula column with this values-only column.
- Delete First Name and Last Name.
- In the Institution column, find the institution with the most reviewers. Add that many “Name” columns to the right of Name1: Name2, Name3, etc.
- Move the names in Name1 to the first of each institution’s rows in the Name2, Name3, etc. columns.
- Do the same with the Email Address column: Email1, Email2, Email3, etc.
- Delete the duplicate institution columns.
- To the left of the institution, create a column titled “SLA.”
- Enter the name of the SLA for each institution.
- Examples: REV2026-Augusta; REV2026-GSU
- To the right of all other columns, insert a new table column titled “Amount.”
- Format this column as Text.
- Using the format $X,XXX.XX, enter the dollar amount per institution. This amount is $1,100.00 x the number of reviewers.
- Example: GHC has three reviewers. The amount is $3,300.00.
- Open the “SLA Template Reviewers (previous year).docx” file.
- Click “Find data source…” and select “ALG Call for Reviewers (rounds).xlsx”. Select the tab “Final Slate.”
- Click “Preview” to ensure that all merge fields carried over from the spreadsheet to the document.
- Ensure that the Statement of Work has the right number of fields for the maximum number of reviewers (Name1, Name2, etc.).
- Ensure that any dates and numbers for the rounds are up to date.
- Use the Timeline document for the year to get peer reviewing dates.
- Distribute SLAs with instructions per previous template examples.
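The reshaping in 3.2.1.7 (one row per institution, Name1..NameN columns, and an Amount of $1,100.00 per reviewer) can be expressed compactly. `build_sla_rows` is a hypothetical helper; the institution abbreviations and SLA naming pattern are examples:

```python
from collections import defaultdict

RATE = 1100.00  # per-reviewer amount used in 3.2.1.7

def build_sla_rows(reviewers, year="REV2026"):
    """Pivot (name, institution) pairs into one row per institution with
    Name1..NameN columns and a text-formatted Amount column."""
    grouped = defaultdict(list)
    for name, institution in reviewers:
        grouped[institution].append(name)
    rows = []
    for institution, names in grouped.items():
        rows.append({
            "SLA": f"{year}-{institution}",
            "Institution": institution,
            **{f"Name{i}": n for i, n in enumerate(names, start=1)},
            "Amount": f"${RATE * len(names):,.2f}",   # $1,100.00 x reviewers
        })
    return rows
```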
3.3 Process: Preparing for Peer Reviews
3.3.1 Preparation Prior to Application Deadline
The day after the application deadline is the day all applications are processed and peer reviews are assigned. That day requires an intense set of processes; any preparations that can be done in advance should be done before this date. Even though the preparation phase is technically one process, this SOP separates out the procedures ALG staff can do before the application deadline for efficiency purposes.
3.3.1.1 Set up the Reviews folder.
Note: In this step, it’s good to know that Ctrl-Shift-N is the keyboard shortcut to create new folders in Windows.
- In the grant round’s folder (example: within “R26-R27,” the “R26” folder), create a “Reviews” folder. In that folder, create the following new folders:
- Accepted Proposals
- After the reviews, this is where you will copy the awarded proposals for SLA and uploading purposes.
- Applications
- This is where you will move all downloaded application files from the Zoho Excel Sheet.
- Forms
- This is where you will move all Grants Office Acknowledgement Forms from the Zoho Excel Sheet.
- Letters of Support
- This is where you will move all Letters of Support from the Zoho Excel Sheet.
- Mail Merge
- After the reviews, this is where you will set up the awarded, rejected, and clarification emails in Mail Merge through Outlook and Word.
- Peer Reviews
- This is where you will “print” all Peer Reviews to PDF to share them with applicants.
- Other Excel
- Zoho Excel Sheets and other Excel sheets aside from the main Reviews sheet will go here to keep the Reviews folder clear and organized.
3.3.1.2 Create a new Excel file for the Reviews Sheet.
- Create a copy of the previous round’s “Reviews” spreadsheet. This spreadsheet should be in the round’s Reviews folder.
- Move this copy to the new round’s Reviews folder.
- Open the new copy and select everything but the top row of each sheet. In the Home tab of the ribbon, click the eraser icon and use “Clear All” to remove all values and formatting.
- Do not clear the Rubrics tab; the rubrics are here for reference while reviewing.
3.3.1.3 Create a new Peer Reviewers and Assignments Sheet.
- In OneDrive, copy the Peer Reviewers and Assignments spreadsheet from the previous round to the Reviews folder. Open this new copy.
- Move the current list of selected Peer Reviewers (from the Call for Proposals spreadsheet) to the Reviewers tab.
- To get the “Full Name” column, use =CONCAT([@[First Name]], " ", [@[Last Name]]). Copy the column’s results and paste them in as values only to remove the formula.
- Sort this table by Full Name.
- Rename the “(Round Number) Assignments” tab’s title to the current round (e.g. “R28 Assignments”).
- In the assignments tab, delete the top row’s names and the 2nd row’s institutions.
- Replace these with the current names and institutions.
- Select the Full Name column and copy (Ctrl-C) those values.
- Right-click the first cell for names (should be in column 3) and, under Paste Options, select Transpose.
- Select the Institution column and copy those values.
- Right-click the first cell for institutions (under the names) and, under Paste Options, select Transpose.
- You should now have what looks like two header rows; for accessibility, these need to be combined into one row.
- In the third row, combine the institution and the name:
- =CONCAT(C2, " ", C1)
- Copy this row and paste only the values in the first row.
- Delete the second row.
- The first row should now read with the institution first and the name second, e.g. “USG Jeff Gallant”
- Clear the previous proposals from the sheet.
- Below the names, select everything but the count formulas located on the right and bottom of the spreadsheet.
- In the Home tab in the ribbon, click the eraser and select “Clear all” to remove the previous grant rows.
- Move the count formulas on the right of the spreadsheet to the first column after the end of the names row.
- In the PivotTable tab, select Refresh All to ensure the values are cleared.
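The two transposed rows above are combined institution-first. A one-line sketch of that CONCAT step (the function name is illustrative):

```python
def merge_header_rows(names, institutions):
    """Combine the transposed name row and institution row into a single
    accessible header row, e.g. "USG Jeff Gallant", as in 3.3.1.3."""
    return [f"{inst} {name}" for name, inst in zip(names, institutions)]
```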
3.3.1.4 Create and link Google Drive folders in the Peer Reviewers and Assignments Sheet.
Note: This should only happen during the first round of the fiscal year. In the second round, these folders and sheets should already exist. Clear the Google Drive folders, ensure any round information is up to date, and they’re ready for the next round of reviews.
- Using the ALG account (galileovideo), create a Google Drive folder for each reviewer’s name within ALG > Peer Reviews > (current year of grants, e.g. “R28-R29”).
- In Google Drive, select all the reviewer folders for this round. They should already be in alphabetical order by full name. Right click, then select “Share,” then “Copy Links.”
- Paste the links into the Peer Reviewers and Assignments spreadsheet, past the table, and keep the one pasted cell selected.
- In the ribbon, in the Data tab, click “Text to Columns.”
- Choose “Delimited” as the column data type.
- Select both comma and space as delimiters, and check the box that treats consecutive delimiters as one.
- This should get you a row of Google Drive links. Select the row and copy (Ctrl-C).
- In the Gdrive column, right-click the first cell and select Transpose (the two-arrows icon). This should paste the Gdrive links in full-name alphabetical order. Check a few of these to make sure.
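The Text to Columns and Transpose steps above boil down to splitting the copied link string on “, ”. A minimal sketch, with an illustrative function name:

```python
def links_to_column(copied_links):
    """Split the comma-separated links Google Drive puts on the clipboard
    into one link per row, mirroring the Text to Columns and Transpose
    steps in 3.3.1.4; empty pieces from repeated delimiters are dropped."""
    return [link for link in copied_links.split(", ") if link]
```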
3.4 Process: Launching Peer Reviews
The following can only happen on the day after the deadline for grant applications.
Warning: Time-Bound and Intense Processes
The portions of this work that enable peer reviewers to review their applications (from this section until the end of 3.4.3, “Send a Mail Merge for peer reviewers”) need to be completed on the day after the application deadline. This is a tight timeline. If you are the Program Director and/or Program Manager responsible for these procedures, keep your calendar open on the day after the application deadline if possible, but at least keep the morning (until noon) open to get the peer reviewer portions done.
While the Administrative Reviews should be finished by the end of the same day to ensure all ALG staff have sufficient time to review, extra time is built in for these reviews at the end; a delay in Administrative Reviews is not as pressing as a delay in Peer Reviews.
3.4.1 Prepare spreadsheets and files for peer reviews.
3.4.1.1 Export and download the Zoho Excel Sheet.
- In “Individual Responses” on the left pane, filter the reports to “Completed Responses.”
- On the top right, click Export > As Spreadsheet.
- Select the option to put the newest reports on top and click Export.
- The export process will start and a link to download the file will be in Exports on the left side of the page.
- Move the downloaded Zoho Excel Sheet to the Reviews folder.
- Open the sheet and delete the first row, which is only an organizing row for Zoho.
- Using Insert > Table, format the sheet as a table.
- You will get a dialogue box saying that header cells will have characters over the 255-character limit removed. This is fine; click OK.
3.4.1.2 Move the Zoho Excel Sheet applications to the Reviews spreadsheet.
- Copy all application rows from the current round.
- Check the Start and Completion Time columns. You are looking for all rows until you reach the date range of the previous round. There’s usually an obvious gap between the previous round’s deadline and the first application of the current round.
- In the Reviews spreadsheet, in the Responses tab, starting with the “Index Number” column (currently column S), paste all the application rows from Zoho into the Reviews sheet.
- This should be one paste action and that’s it. We’ve set up the Responses tab to have a column corresponding to every Zoho review sheet column in order to make this happen.
- Do not use the “Paste Values” option in this case. The hyperlinks to each Word application, letter, and form are part of the formatting in the Zoho spreadsheet.
- Sort this sheet by the TypeOfGrant column (currently column AB).
- In the Grant # column (column A), number the applications.
- In the previous round’s Reviews spreadsheet, check the last application numbers for Transformation, CI, and Research in the previous round’s Excel file. You may need to rename the last round’s spreadsheet since both files might be named “Reviews.”
- Start numbering the new applications with the numbers following the last application numbers for the last round.
- Sort this sheet by Grant #, smallest to largest.
3.4.1.3 Download, rename, and move all application files to their respective folders.
Note: For these steps, it may be helpful to know that you can rename files without a mouse. F2 is Windows’ shortcut to rename a file. Tab moves to the next file on the list. You can hit F2 on the first file, then start a loop: enter the filename, then Tab, then enter the filename, then Tab, and so on.
- First, copy the WordApplication, LetterSupport, and AcknowledgmentForm columns and paste them directly to the right of the Grant # column (A). Remember to delete these helper columns after you have finished this task.
- Using the WordApplication column’s links, download, rename, and add the applications to the round’s Applications folder.
- Using a small window for Excel and a small window for your browser, click on the Excel links in order. The browser will download each file from the URL.
- In the download folder, using the same order, rename the application files to their respective grant numbers.
- Example: 702.docx
Note: The next filenames all include a hyphen and word after the number for uniformity. To make this easier, highlight the hyphen and word and hit Ctrl-C to copy it. Then, while renaming all the files, just hit Ctrl-V after each number to add it in.
- Using the AcknowledgmentForm column’s links, download, rename, and add the Grants Office Acknowledgment Forms to the Forms folder.
- Using a small window for Excel and a small window for your browser, click on the Excel links in order. The browser will download each file.
- In the download folder, using the same order, rename the application files to their respective grant numbers followed by “-form.”
- Example: 702-form.pdf
- Using the LetterSupport column’s links, download, rename, and add the Letters of Support to the Letters of Support folder.
- Using a small window for Excel and a small window for your browser, click on the Excel links in order. The browser will download each file.
- In the download folder, using the same order, rename the application files to their respective grant numbers followed by “-letter.”
- Example: 702-letter.pdf
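The download-and-rename loop above can also be scripted. The sketch below assumes the files in the download folder sort into the same order as the grant numbers; verify that assumption before running anything like it:

```python
from pathlib import Path

def rename_downloads(download_dir, grant_numbers, suffix=""):
    """Rename downloaded files to their grant numbers in download order,
    e.g. 702.docx or 702-form.pdf, as in 3.4.1.3. Assumes files sort by
    modification time into the same order as grant_numbers."""
    renamed = []
    files = sorted(Path(download_dir).iterdir(), key=lambda p: p.stat().st_mtime)
    for path, number in zip(files, grant_numbers):
        target = path.with_name(f"{number}{suffix}{path.suffix}")
        path.rename(target)
        renamed.append(target)
    return renamed
```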
3.4.2 Assign peer reviews using the Peer Reviews and Assignments sheet.
- Add the application numbers and institutions for each application to the sheet.
- For all cells where a reviewer’s institution in the column is the same as the application’s institution in the row, enter an X.
- In the past, we filled these cells with a black color; this is inaccessible to screen readers. Instead, use conditional formatting to make every X have a blacked-out cell. This is a universal design upgrade: the process is easier and keyboard-operable.
- Assign reviews to each reviewer with a Y. Three per Transformation Grant, one per CI and Research Grant.
- The Count formulas on the outside of the table should be aligned with the number of reviewers (pointing to the correct number of columns) and the number of applications assigned per reviewer (pointing to the correct number of rows).
- Add a PivotTable to the next tab pointing to this assignment table.
- In the PivotTable fields, set Columns to the reviewer name and Rows to the application number. Filter the table by “Y.” You’ll get an easy list of what to assign to each reviewer in their Google Drive folders.
- Drag-and-drop the set of applications for each reviewer into their Google Drive folders. Ensure everything uploads correctly.
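The conflict-marking rule above (an X wherever reviewer and application share an institution) can be sketched as:

```python
def conflict_matrix(applications, reviewers):
    """Mark "X" wherever a reviewer shares an institution with an
    application, as in 3.4.2. Both inputs are (id, institution) pairs;
    returns {application id: {reviewer name: "X" or ""}}."""
    return {
        app_id: {name: ("X" if app_inst == rev_inst else "")
                 for name, rev_inst in reviewers}
        for app_id, app_inst in applications
    }
```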
3.4.3 Send a Mail Merge for peer reviewers.
- Ensure the following (as these have gone wrong in the past):
- The Peer Review Zoho form is up to date with correct reviewer names and rubric fields.
- Google Drive links in the Reviewers spreadsheet point to the correct reviewer’s name.
- Every reviewer’s Google Drive folder has the appropriate applications.
- In the previous round’s reviews folder, within the Mail Merge folder, copy the “Email-R##Applications” file (##=that round of grants) to the new round’s Mail Merge folder. Open this file:
- Using the Mailings tab (or a dialogue box that may pop up immediately), point this document toward the new Peer Reviewers and Assignments Excel file and the Reviewers table.
- Replace any dated language and any old links with current language and links.
- Send the mail merge using the “email” field. Be sure to update the subject line.
- Keep an eye out for emails coming in during the first day and be ready to fix things; this is a large process and things can get missed, skipped, or glitchy.
3.5 Process: Launching and Performing Administrative Reviews
NOTE: These can be started during the Peer Reviews period to spread the work out over a more feasible amount of time.
Administrative reviews are important to the grant review process in balancing peer reviews with an evaluation based directly on the overall objectives of the program and official USG and GALILEO policies. These reviews also assist ALG in fully understanding each proposal’s objectives, plan, possible impact, and budget. ALG administrators score each proposal based on the same rubric used by peer reviewers.
ALG also performs additional technical and procedural checks, such as validating the links given for the original price of materials, Letters of Support, and Grants/Business Office Acknowledgment Forms. ALG also must ensure that any Transformation Grant awarded does not duplicate a past project within the same institution in the same course.
Any clarification questions the administrators have with the proposal are recorded, and if a proposal is considered for an award, those clarifications are first sent to the applicant and must be clarified before the award is officially given.
After the peer review process has ended, all peer reviews are checked, weighed, and averaged, and qualifying priority points are added to both the peer review averages and the administrative review scores. These are then compared with each other. All applications are then discussed, and a final slate is approved by ALG and the Executive Director of GALILEO.
3.5.1 Prepare the Reviews spreadsheet for administrative reviews.
3.5.1.1 Clean application data in the Excel sheet.
This next task involves cleaning the numerical data entered by applicants, a necessary step since the varied ways applicants format their numerical answers can otherwise cause errors and miscalculations. If an applicant gives a range of numbers where one average number should be (like “$50-150”), you have two choices: either average the endpoints of the range to find the midpoint (($50 + $150) / 2 = $100 in this case) or contact the applicant for their own reported average.
What you do with a range may be a judgment call based on the rest of the application: for example, if it’s stated that most students do not purchase the $150 copy and opt for the $50 version, savings are more likely to be $50. If it isn’t clear and applicants would not be able to get you an average other than the range’s midpoint, just use the midpoint.
This is why we ask applicants to provide us with total annual savings numbers: while we can calculate those here, we have no idea if something else is happening with their calculations beyond what’s provided in the spreadsheet. If their annual numbers are based on something they know about the range they gave us, that’s important information.
Every so often, an applicant will either miss a cell or mix up the math involved in the application process. If total savings numbers for Transformation Grant projects are coming up as $0, there’s a big chance this happened on the Zoho form. Check on the Word application to see if they have non-zero savings numbers entered under Impact Data and fix accordingly.
- Clean the numerical columns in the Responses tab.
- Award Amount (currently AH):
- Create a new table column to the right. No title necessary.
- In the 2nd cell of the new column, use the FIXED formula on the corresponding Award Amount cell.
- =FIXED(AH2, 2)
- This converts the many different types of numerical answers applicants provide into a fixed two-decimal number format.
- Copy this formula throughout the column.
- Copy the column’s results.
- Paste the column’s results as values (right-click / 123 button) into Award Amount.
- Delete the column you created for the formula.
- Course Numbers: Clean this column so that each is a list of course numbers separated by commas.
- Course Names: Same as Course Numbers.
- Link the header cell in Grant # to the Applications folder.
- In the Reviews folder:
- Right-click the Applications folder.
- Select "Share."
- Next to the "Copy Link" button, click the gear (Settings) and select "Only people with existing access."
- Click "Apply."
- Click "Copy Link."
- In the Responses tab:
- Right-click the "Grant #" heading.
- Select "Link."
- Paste in the link to the folder.
Note: In past rounds, we would link each .docx file to each grant number in the "Grant #" column. This was feasible with twenty to thirty applications; it is infeasible with the sixty or more applications we may receive now. The new "Grant #" link procedure brings viewers of the Excel file directly to the folder to view all applications.
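The =FIXED(cell, 2) cleanup above has a simple scripted equivalent. `to_fixed` below is an illustrative helper; range answers like “$50-150” still require the judgment call described earlier and should not be fed through it:

```python
import re

def to_fixed(raw):
    """Normalize a free-form dollar answer (e.g. "$1,234" or "1234.5")
    to a comma-grouped two-decimal string, like Excel's =FIXED(cell, 2)
    in 3.5.1.1. Not suitable for range answers such as "$50-150"."""
    number = float(re.sub(r"[^\d.]", "", raw))  # strip $, commas, spaces
    return f"{number:,.2f}"
```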
3.5.1.2 Clean Transformation Grants-exclusive columns.
- NoCostLowCost: Copy a No-Cost cell, then press Ctrl-H (Find and Replace). Paste the No-Cost text into the Find field and enter "NC" in the Replace field. Replace all. Do the same for the Low-Cost cells with "LC."
- 1YrROI: Subtract the award amount from the project’s total annual savings:
- In the top non-header cell (row 2), enter the formula =[@TSavingsPerYear]-[@[Award Amount]] and hit Enter.
- Copy this formula down the column for the rest of the projects.
- It’s helpful to do conditional formatting here for your own purposes to quickly visualize when the ROI is less than zero.
- 3YrROI: Multiply the total annual savings by three, then subtract the award amount:
- In the top non-header cell (row 2), enter the formula =3*[@TSavingsPerYear]-[@[Award Amount]] and hit Enter.
- Copy this formula down the column for the rest of the projects.
- It’s helpful to do conditional formatting here for your own purposes to quickly visualize when the ROI is less than zero.
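Both ROI columns reduce to one expression; a sketch matching the 1YrROI and 3YrROI formulas above:

```python
def roi(total_savings_per_year, award_amount, years=1):
    """Projected return used in 3.5.1.2: total annual savings times the
    number of years, minus the award amount."""
    return years * total_savings_per_year - award_amount
```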
3.5.1.3 About ROI Values
ROI values are not to be used as strict accepted/rejected thresholds in evaluating proposals, as outcomes from projects are both monetary (student savings) and non-monetary (student success, engagement, instructional improvement, etc.). Nor should they be interpreted as comprehensive ROI values. Comprehensive ROI, largely used in for-profit business settings, includes not just material and labor costs versus profit but various costs of doing business, such as the current interest rate when borrowing money to fund a project. The ROI values in this spreadsheet do not take into account external costs like ALG staff time, the cost per use of a repository, time and staff costs associated with the distribution of funds from either the system office or the institution’s offices, and costs for managerial functions taking place outside of ALG. ALG's ROI values are indicators exclusive to the grant program, looking only at how much students save on textbook costs compared to the amount of the grant award over a year of implementation.
Some projects with a low one-year ROI but a great plan (and therefore a high chance of being a sustainable project over many years) would be a great fit for a Transformation Grant. Other teams may have picked the wrong category given the larger amount of funding where a Continuous Improvement Grant might fit the proposal better. The 1-year ROI is an easy way of bringing multiple numbers together to see what the project is doing for student savings, but it isn’t the whole story. ALG will often see a project with a low 1-year ROI but a high Peer Review score, and depending on the competitiveness of the round, that project may be awarded.
This is why ALG added a 3-year ROI value: it does not make sense that ALG would assume projects all discontinue after a year. A 1-year estimate is a conservative one that accounts for things going wrong or abrupt turnover happening, but a 3-year estimate is centered more around our experiences with most projects continuing past one year. If the 3-year ROI is under $0, there’s likely a major problem with the project’s student savings impact in relation to the requested budget.
3.5.2 Assign administrative reviews.
- In the Responses tab, in the Reviewer column, assign an administrative reviewer name to each application row.
- Distribute as evenly as possible across categories.
- ALG has checked for trends in quality and scores between early-arriving and later-arriving applications: these do not seem to correlate. Accounting for the application submission date and time is, at the moment, unnecessary.
- In the AdminReviewsT tab, copy and paste both the Responses tab’s Grant # and Reviewer columns in their corresponding columns on this tab.
- Do the same for AdminReviewsCI and AdminReviewsRG.
3.5.3 Review assigned Transformation Grant applications.
3.5.3.1 Perform Letter and Form Checks.
In the AdminReviewsT tab, for all applications, one column at a time:
- Perform a Letter Check.
- A Letter Check is a quick read of the Letter of Support for the application in the LetterURL column. Letters of Support must contain the following:
- Departmental and Institutional Support: upper-level approval for the project’s work and funding.
- Sustainability and Resource Commitment: the implementation will be sustainable beyond the grant award period.
- Collaborator and Multi-Departmental Engagement: All collaborating departments have been informed and support the project.
- Institutional Oversight and Accountability: Confirmation that all team members are authorized to participate.
- In the Letter Check column, if the check is successful, enter Y. If not, enter N with the reason why.
- Perform a Form Check.
- A Form Check is a quick check for the Business/Grants Office Acknowledgement Form that it’s completed and signed.
- In the Form Check column, if the check is successful, enter Y. If not, enter N with the reason why.
3.5.3.2 Review applications.
In the AdminReviewsT tab, for each application, one by one:
- When starting to read the application, perform a Price Check.
- A Price Check looks at the URL(s) provided for the original cost of commercial materials.
- If the check is successful, enter Y. If not, enter N with the reason why.
- An acceptable answer is “Y varies” when the price appears to vary slightly from day to day, which is common in the publishing industry.
- Evaluate each application.
- See the Rubrics tab for a full list of all rubrics and categories. Each score is on a five-point scale with five as the highest score.
- Enter only a numeral in the Score columns (e.g. T1Score).
- Enter text in the Comment columns (e.g. T1Comment).
- In the Administrator Comments column, enter your overall comments on the proposal or anything that does not correspond to the rubric comments.
- The Weighted Review Score column should calculate on its own.
- The Total Score column should add the Priority Points to the Weighted Review Score.
- After evaluation, perform a Priority Check.
- For all priorities selected, enter which priorities are awarded in the Priorities Awarded column. Use a one-word abbreviation for each priority; this makes it easier to count.
- Enter the number of priorities awarded in the "#PrioritiesAwarded" column.
- The Priority Points column should calculate as a result of entering this number.
- Proceed to the next application.
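The scoring columns above reduce to Total Score = Weighted Review Score + Priority Points. The sketch below illustrates that relationship in Python; the points_per_priority constant is a hypothetical stand-in, since the actual multiplier lives in the workbook's own formulas.

```python
def priority_points(n_priorities_awarded: int,
                    points_per_priority: float = 1.0) -> float:
    """Hypothetical: the spreadsheet derives Priority Points from the
    #PrioritiesAwarded entry; the real multiplier is defined in the workbook."""
    return n_priorities_awarded * points_per_priority

def total_score(weighted_review_score: float,
                n_priorities_awarded: int) -> float:
    """Total Score = Weighted Review Score + Priority Points (per 3.5.3.2)."""
    return weighted_review_score + priority_points(n_priorities_awarded)

# A proposal with a 4.5 weighted review score and two awarded priorities,
# under the assumed 1 point per priority:
print(total_score(4.5, 2))  # 6.5
```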
3.5.4 Review assigned Continuous Improvement Grant applications.
- In the AdminReviewsCI tab, review each Continuous Improvement Grant application assigned using the rubric.
- Perform a Letter Check per the instructions in AdminReviewsT.
- Perform a Form Check per the instructions in AdminReviewsT.
- Read over the application. Then:
- Evaluate each application for its rubric categories per the instructions in AdminReviewsT.
- The Total Score column should calculate on its own.
3.5.5 Review assigned Research Grant applications.
- In the AdminReviewsRG tab, review each Research Grant application assigned using the rubric.
- Perform a Letter Check per the instructions in AdminReviewsT.
- Perform a Form Check per the instructions in AdminReviewsT.
- Read over the application. Then:
- Evaluate each application for its rubric categories per the instructions in AdminReviewsT.
- The Total Score column should calculate on its own.
3.6 Process: Processing Peer Reviews
3.6.1 Export, copy, and process the peer reviews.
Note: From this point onward, these tasks can only be completed the day after the peer review deadline has passed.
- Go to the Zoho peer reviewer form.
- In “Reports,” select “Individual Responses.”
- On the top right corner, go to “Export” and select “As Spreadsheet.”
- In the next dialogue box, if there is an option for “Multi-Answer Question Format,” select either “Separate Columns” or “Both.”
- In Response Ordering, select “Newest First.”
- Download the exported sheet and move it to the round’s Reviews folder.
- Give it a unique title that includes the date, as this file will be moved to the “Other Excel” folder afterward.
- Open the exported form.
- Delete the first row and format the rest as a table.
- Delete the following Zoho-generated columns if they appear: Total Rating, Scored Rating, CSAT Score, and Average Rating (out of 10).
- Delete all rows of reviews that took place before your current round.
- Use the Completion Time column to find this easily.
- Copy the non-header rows into the Reviews sheet in the Peer Reviews tab. These should align so that the calculation columns (from “TG1” onward at the end) automatically work.
- Filter the Response Status column to pick out any “PARTIAL” responses. If they’re blank responses, delete those rows. If not, ensure there’s an updated “COMPLETE” response from that reviewer for that application and then delete the “PARTIAL” response. You may need to follow up with the reviewer on this response in the next step.
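The PARTIAL-response cleanup above can be sketched in plain Python. The rows and column names below are hypothetical miniatures of the exported sheet; the rule is the one from this step: drop a PARTIAL response only when a COMPLETE response exists from the same reviewer for the same application.

```python
# Hypothetical miniature of the exported Zoho review sheet rows.
rows = [
    {"reviewer": "A", "grant": "774", "status": "PARTIAL"},
    {"reviewer": "A", "grant": "774", "status": "COMPLETE"},
    {"reviewer": "B", "grant": "775", "status": "COMPLETE"},
]

# Reviewer/application pairs that already have a COMPLETE response.
complete_pairs = {(r["reviewer"], r["grant"])
                  for r in rows if r["status"] == "COMPLETE"}

# Keep a PARTIAL response only if no COMPLETE response supersedes it;
# those surviving PARTIALs are the ones needing reviewer follow-up.
cleaned = [
    r for r in rows
    if r["status"] == "COMPLETE"
    or (r["reviewer"], r["grant"]) not in complete_pairs
]
print(len(cleaned))  # 2 rows remain, both COMPLETE
```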
3.6.2 Check peer review completion and send confirmation emails.
- In the “PeerPivot” tab, refresh your results in the PivotTable Analyze section of the ribbon.
- In the “Reviews” folder, open the “Peer Reviewers and Assignments” spreadsheet.
- Ensure that each reviewer has reviewed the correct applications and the correct number of applications.
- If a reviewer has a wrong application number listed or two of the same number listed, check those out and amend as necessary. Reach out if a fix is unclear.
- In the “Peer Reviewers and Assignments” spreadsheet, in the “Reviewers” tab, fill in the (current round)Completed column (y/n).
- Filter this table by (current round)Completed “y.”
- Email this group to confirm:
- They have completed all of their required reviews for the current round.
- We are ready for an invoice once their institution’s SLA is fully-executed.
- Remind them that USG Legal can take up to one month after we route the SLA to obtain USG signatures, as their office is quite busy, and that they should check in with us if more than a month has passed since we received the institution-signed SLA.
- Include a table with each reviewer and the status of all SLAs.
3.6.2.1 Email Example
Subject: ALG Round 28 Reviews Completed! Invoice Directions and SLA Status
Good morning, Round 28 Reviewers:
If you're receiving this email, you have completed all of your assigned reviews for Round 28 on time. Thank you so much! That's over 120 meaningful reviews with comments that will not only heavily influence the final grant award decisions, but also give context to those who weren't selected in order to revise and resubmit in another round.
If your institution's SLA has been fully-executed (signed by your institution and by the USG), we are ready for your first invoices. These invoices should follow the directions given on the SLA for one completed round of grant reviews. Please work with your contacts within your offices and/or your Grants/Business Office to get an institutional invoice sent to me. I've included the main contacts from your institution in this email. We'll then route the invoice for signatures from here.
Please do not send the invoice directly to Accounts Payable, as this will delay the process. We need to make a formal Payment Request first.
Below is a table of the status of everyone's SLA. If yours has not yet been fully-executed, please keep in mind that USG Legal can take up to one month after we route it to them to sign it. If it's been over a month since you saw the institution-signed SLA sent over to me, please check in!
Best,
Jeff
[Table with SLA information]
Note: Send a check-in email about the invoices at least one week later and again one month later, and possibly each month; mark a date on your calendar to do so.
3.7 Process: Processing Administrative Reviews
3.8 Process: Making Final Decisions
(Sections 3.7 and 3.8 will be completed once the administrative reviews are finished.)
(After section 3: subsequent meetings, including Kickoff and Midpoint, plus asynchronous training and deadlines.)
4 Major Process: Final Reports and Accessibility Checks
Note: The following processes can be completed before the end of the Final Report deadline.
4.1 Process: Processing Final Reports
4.1.1 Preparing for and Receiving Final Reports
4.1.1.1 Create a folder for the semester.
- In Grants / Final Reports and Materials / [Fiscal Year] / [Semester], add a new folder, e.g. “Spring 2024.”
- In the semester folder, create an Invoices folder.
4.1.1.2 Create a Final Report Excel file.
- Copy the previous final semester's Excel file in Grants / Final Reports and Materials / [Final Semester].
- In all tabs except Metadata, use Clear All to remove every row except the headers.
- Place the copy in the new folder for the semester.
- Using the Data > Final Report Materials List dashboard in Power BI, copy the table for all projects with the current Final Semester:
- Change the Final Semester slicer to the current semester.
- On the table, click the three dots in the upper right-hand corner and select "Export Data."
- Export the data to a CSV.
- Using the new CSV, add all projects with the current Final Semester to the Final Report Excel file.
- Note: "A" and "B" projects (example: 620.a, M320.b) are not necessary for materials processes. Consolidate these into one row per project for now.
4.1.1.3 Check for any amended delays.
- Because delay requests are done by email, be sure to check your email to see if any delay requests happened before starting the rest of the final report processes.
- Each time a delay is requested and accepted via email, include the word "delay" and the project number in your reply.
- Search for the word "delay" to find all delay requests for the current semester.
- Ensure any delays past the 31-day built-in delay have an amendment process started.
- Enter the delay status for delayed projects in the "Delay" column.
Note: The following processes cannot be completed until after the Final Report deadline. Doing a preliminary run of some of these processes before the deadline, like uploading files for early-arriving reports, can help make the morning after the deadline easier.
4.1.2 Processing Final Reports
4.1.2.1 Export and download the Zoho Survey spreadsheet.
- Log into Zoho Survey.
- Select the ALG Final Report Submission Form.
- On the left sidebar, click "Reports."
- On the top-right corner, click "Export" and then "As Spreadsheet."
- Change Response Ordering to "Newest First."
- Click "Export."
- In the next dialog box, click "Go to Exports."
- Click "Download" on the right to download the new spreadsheet.
- Add this file to the semester's Final Report folder.
4.1.2.2 Customize the spreadsheet for readability and downloads.
- Open the Zoho Survey spreadsheet.
- Go to the "Responses" tab.
- With the exception of the “Response Completion Time” column, delete everything up until the “Which type of grant are you reporting?” column.
- Delete the top blank row.
- Create a table out of the resulting columns.
- In the ribbon, click "Insert" and then "Table." Ensure the table has the first row as the headers.
- Delete all but the current round’s applications.
- Sort by Response Completion Time. There should be a significant gap between the first submission for the current round and the last submission for the previous round.
- Delete all rows except for current round submissions.
- Move the "ALG Grant Number" column to Column A.
- Look down the column to check for errors:
- Duplicate reports: Keep the completed report with the latest completion date and delete the earlier instances.
- Be sure to check the entire row. If the latest version of the report isn't complete, keep the complete report and delete the incomplete one.
- Non-standard grant numbers: Using the Materials spreadsheet, find the standard grant number and correct it.
- Sort the sheet by ALG Grant Number and re-check for duplicates.
- Move the columns near the end of the sheet containing links to the Word report, syllabi, materials, and web links to materials to columns B, C, D, and E for quick access.
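The duplicate-report rule above can be sketched in Python. The rows below are a hypothetical miniature of the Zoho export; the logic keeps, for each grant number, the complete report with the latest completion time and drops earlier instances.

```python
from datetime import datetime

# Hypothetical final-report rows; real column names are assumed to match
# "ALG Grant Number" and "Response Completion Time" in the export.
rows = [
    {"grant": "774", "completed": datetime(2025, 5, 1), "complete_report": True},
    {"grant": "774", "completed": datetime(2025, 5, 3), "complete_report": True},
    {"grant": "775", "completed": datetime(2025, 5, 2), "complete_report": True},
]

# Walk the rows in completion order so the latest complete report for
# each grant number overwrites any earlier instance.
latest = {}
for row in sorted(rows, key=lambda r: r["completed"]):
    if row["complete_report"]:
        latest[row["grant"]] = row

deduped = sorted(latest.values(), key=lambda r: r["grant"])
print([r["grant"] for r in deduped])  # ['774', '775']
```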
4.1.2.3 Download, rename, and upload files to their project's folders.
- In the Zoho Survey spreadsheet, download one row of files.
- Rename these files using the Grant # based on the download: [#]-report, [#]-syllabus (or [#]-syllabi), [#]-materials.
- e.g. 774-report, 774-syllabus, 774-materials.
- In the semester's Final Report folder, create a folder named with only the grant number (e.g. 774), and move the files into it.
- In the Zoho Survey spreadsheet, if there is a URL in the Materials Link column:
- Create a text file (.txt).
- Paste the URL into the text file.
- Name this file [#]-materials.txt. Note: This is in order to keep a record of non-local materials when adding metadata.
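The renaming convention above can be sketched in Python. This is an illustration, not part of ALG's actual tooling; the function names are hypothetical.

```python
def report_filenames(grant_number: int, syllabus_count: int = 1) -> list:
    """Build the standard final-report filenames for one grant (per 4.1.2.3):
    [#]-report, [#]-syllabus (or [#]-syllabi), and [#]-materials."""
    syllabus = "syllabi" if syllabus_count > 1 else "syllabus"
    return [f"{grant_number}-report",
            f"{grant_number}-{syllabus}",
            f"{grant_number}-materials"]

def materials_link_filename(grant_number: int) -> str:
    """Name for the text file recording a non-local materials URL,
    kept so non-local materials are traceable when adding metadata."""
    return f"{grant_number}-materials.txt"

print(report_filenames(774))  # ['774-report', '774-syllabus', '774-materials']
```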
4.1.2.4 Log all submitted Final Reports.
- In the Materials Excel file, in the Received column, enter "Y" if you have received the grant project’s report.
- In the Received column header, filter the Excel file for only blanks.
- If the Delay column has a 31-day delay, enter "Delay." If it has an amendment, enter the new semester (e.g. “Spring 2027”). If the Delay column does not have a delay, enter "N."
- A 31-day delay is built into the SLA for common emergencies and project delays. Any longer than that, and we will need a signed amendment to the SLA extending the deadline of the project. ALG prefers that these delays correspond to another semester’s deadline.
- Because 31-day delays are often done via email, do one last check in the resulting "N" entries for any email sent by the Project Lead to ensure you haven't received a delay request or a report that did not arrive through the form. Make any necessary modifications.
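The Received/Delay logic above can be sketched as a small decision function. This is a hypothetical illustration of the rules in this step, not an ALG tool; the helper name and parameters are assumptions.

```python
from datetime import date, timedelta
from typing import Optional

def delay_status(received: bool, deadline: date,
                 amended_semester: Optional[str], today: date) -> str:
    """Mirror of 4.1.2.4: 'Y' if the report arrived; the new semester if
    a signed amendment extends the deadline; 'Delay' within the 31-day
    window built into the SLA; otherwise 'N'."""
    if received:
        return "Y"
    if amended_semester:
        return amended_semester
    if today <= deadline + timedelta(days=31):
        return "Delay"
    return "N"

# A report still outstanding 19 days past the deadline is within the
# built-in 31-day delay window.
print(delay_status(False, date(2025, 5, 1), None, date(2025, 5, 20)))  # Delay
```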
4.1.2.5 Account for all missing Final Reports and materials.
- Send an email to all project leads with a missing Final Report:
- In the Received column header, filter the Excel file for only "N."
- Email all Project Leads in this list with a small table showing which projects, which courses, and which Leads are associated with the grant number.
- Send an email to all project leads who previously delayed their Final Report:
- In the Received column header, filter the Excel file for only "Delay."
- Email all Project Leads in this list with a small table showing which projects, which courses, the delay date, and which Leads are associated with the grant number.
- Note: This is just a reminder email for delayed projects. Delayed projects suffer more than non-delayed projects from leads forgetting their deadlines. Do this at least one week before the delayed deadline.
- Check for complete materials.
- Read through the Final Report and ensure that any materials listed are included in the folder or in a link within the report.
- If there are materials for multiple courses, create materials rows for each course.
- Insert enough blank rows to cover each course’s materials.
- Copy the project's original row into these blank rows.
- Manually replace the Course Titles and Course Numbers columns so that each row lists one course.
- Open the materials and check for completeness.
- In the Materials spreadsheet, mark "Y" or "N" in the Complete column. If this is an adoption or a research grant without materials, mark "Y."
- If there are missing materials, email the Project Lead about this immediately.
4.2 Process: Performing Accessibility Checks on Open Materials
4.2.1 Overview
As of Spring 2025, all resources submitted to ALG as deliverables for a completed grant project are first evaluated by ALG staff prior to publication. This accessibility check ensures that educational materials have the following features, which apply to the ADA Title II requirements for conventional documents. This accessibility check does not ensure 100% compliance; many accessibility requirements will be case-by-case depending on the type of file, and third-parties may be hosting the materials outside of ALG. However, it ensures that the most common accessibility issues with OER in the form of conventional documents are addressed and that the materials are likely ADA Title II compliant to the best of ALG's knowledge and capacity.
4.2.1.1 Criteria
4.2.1.1.1 Alternative Text or Captions
When images or videos are used to convey information in a resource, there must be a text alternative to convey the same information. For images in a text document, either alternative text or captions must be provided: providing both generally creates a repetitive reading for screen readers.
When images are purely decorative and do not convey information, alt text is not required. Marking decorative images as decorative within the creation platform is preferred but not required.
4.2.1.1.2 Structured Text
When text is used to convey information in a resource and contains multiple sections and multiple pages, the text must be structured in a way screen readers are able to recognize. This allows screen readers to navigate between sections without reading the entire text.
Exceptions for structured text are made with small documents which do not require structural navigation by a screen reader. An example of this would be a one-page Word document containing a title and description for an essay assignment.
4.2.1.1.3 Color Contrast
When text is used to convey information in a resource, contrast between the background and the text must meet or exceed a luminance ratio of 4.5:1. Because this is a technical requirement, contrast is checked by an accessibility checker if this check is available. When this check is not available, contrast is checked visually by staff and will involve subjectivity in the evaluation. Contrast can be adjusted by the reader depending on the platform, such as the Immersive Reader in Microsoft Word.
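The 4.5:1 threshold comes from the WCAG definition of contrast ratio, which can be computed directly when the exact colors are known. The sketch below implements the WCAG 2.1 relative-luminance and contrast-ratio formulas for sRGB colors; the function names are our own.

```python
def relative_luminance(rgb) -> float:
    """WCAG 2.1 relative luminance of an sRGB color (channels 0-255)."""
    def channel(c: int) -> float:
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg) -> float:
    """WCAG contrast ratio between two colors; body text must meet or
    exceed 4.5:1 (per 4.2.1.1.3)."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

# Black text on a white background has the maximum contrast, 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
```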
4.2.1.1.4 Reading Order
When information is divided into multiple sections on a page or a slide, the order of this information must be structured in a way screen readers are able to recognize. Without a correct reading order, screen readers are unable to read this information in the order intended by the author.
4.2.1.2 Case-by-Case Issues
Less-common resource types can be reviewed for case-by-case accessibility issues based on the nature of the resource. These resource types include:
- Spreadsheets and tables
- Mathematics graphs and equations
- Music notation
- Audio-only resources such as podcasts
- Raw code files such as Python or R
- Non-English language text
4.2.1.3 Remediation and Capacity
The extent of reviews and remediations of issues specific to these types of resources will depend on the capacity of the grant team and ALG staff. If a case-by-case accessibility issue is encountered, ALG adds a description of the issue to the entry's description or abstract field.
4.2.1.4 The Evaluation Process
Evaluations of each set of resources are targeted rather than comprehensive due to ALG staff capacity. ALG staff evaluate a sample of each file type submitted by a project team for these four criteria: generally, if at least two of the same file type are evaluated and found to be accessible, the rest of the files of that type are likely to be accessible. There are instances where project teams are large and files are individually authored by different team members or mixed with adopted or adapted OER from external sources: in this case, more files are evaluated to account for various authors.
4.2.1.5 Status
Educational resources hosted by ALG (as listed in ALG Commons) prior to our Fall 2024 documents (added circa January 2025) were evaluated for the four criteria above in a 2025 accessibility audit by ALG staff. All entries in ALG Commons, including pre-existing resources, are being assigned an Accessibility Status based on these four categories:
4.2.1.5.1 Compliant
After testing by Affordable Learning Georgia staff, the resources in this entry were found likely to be ADA Title II compliant.
4.2.1.5.2 Non-Compliant
Resources include pre-existing conventional electronic documents that are not ADA Title II compliant and should not be used to apply for, gain access to, or participate in an institution’s services, programs, or activities.
4.2.1.5.3 Third-Party
Resources are hosted outside of Affordable Learning Georgia. Authors must ensure that resources are ADA Title II compliant in order to be used to apply for, gain access to, or participate in an institution’s services, programs, or activities.
4.2.1.5.4 Report
This resource is a report which is provided by Affordable Learning Georgia for historical reference. This is not an educational resource and it is not intended for required access or participation in an institution's services, programs, or activities.
4.2.2 Evaluations by File Type
- Check the list of materials within the folder for each grant project.
- In the Excel Sheet, in the File Type column, enter the types of files included separated by commas: "PDF, Word."
- For all file types, check at least two of each type of file.
- If the accessibility status is identical, confirm the accessibility status.
- If the accessibility status is not identical, check more files to get an overall picture of what is and isn’t accessible. After that, proceed to 4.2.3.
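The two-per-file-type sampling above can be sketched in Python. This is a hypothetical illustration of the sampling rule, not an ALG tool; larger samples are taken manually when results disagree or authorship is mixed.

```python
from collections import defaultdict
from pathlib import Path

def sample_for_review(filenames, per_type: int = 2) -> dict:
    """Group a project's files by extension and pick up to two of each
    type to check (per 4.2.2)."""
    by_type = defaultdict(list)
    for name in filenames:
        by_type[Path(name).suffix.lower()].append(name)
    return {ext: names[:per_type] for ext, names in by_type.items()}

files = ["ch1.docx", "ch2.docx", "ch3.docx", "slides.pptx"]
print(sample_for_review(files))
# {'.docx': ['ch1.docx', 'ch2.docx'], '.pptx': ['slides.pptx']}
```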
Note: Unlike the majority of the SOP document, the following subsections are not in chronological order. The subsections pertain to accessibility evaluation processes exclusive to file types:
4.2.2.1 Word Documents
- Open the Word file.
- In the ribbon, click "File," "Info," "Check for Issues," "Check Accessibility."
- The Accessibility Checker only works with current DOCX files. Other filetypes opened in Word will have to be converted to DOCX. If Word prompts you to do this, go ahead and convert it to a current DOCX file. If it doesn't, click "File," "Save As," and save the file as a .docx file.
4.2.2.1.1 Alternative Text
- The Accessibility Checker has a space for alternative text.
- If there are only a small number of images without alternative text and you are able to describe them, this is an easy fix.
- If there are a large number of images without alternative text, or if the images require subject specialization in order to describe them, you will need to contact the author.
- There is a way to mark alt text images as decorative using a checkbox. This is applicable only if the images are not conveying any information to the reader.
- If this is complete, in the Materials spreadsheet, enter "Y." If not, enter "N." If there are no images, enter "n/a."
4.2.2.1.2 Structured Text
- The Accessibility Checker has a space for structured text issues.
- In the Navigation Pane on the left side of the page, you should see a list of all headings.
- If the left side is blank, there are no headings. This is normally fixable by us if it requires only chapter-level divisions or a small number of subchapter-level divisions. If there are an infeasible number for us to fix, we'll need to contact the author.
- To fix the headings:
- Go to the first place that should be a Heading 1 (this is normally a chapter title) and highlight it.
- In the ribbon, click Home.
- In the Styles group, click the small arrow in the bottom-right corner of the box to bring up an easier-to-use Styles menu.
- Right-click Heading 1.
- Click "Update Heading 1 to match selection." This ensures that applying the Heading 1 style won't change the formatting in the textbook itself.
- Assign Heading 1 to each chapter title using the Styles pane.
- For each subchapter, do the above, but update Heading 2 and use it for each subchapter title.
- Repeat for deeper subdivisions using the next level of headings if needed.
- In the Materials spreadsheet, if this is complete, in the Structured Text and PDF Tags column, enter "Y." If not, and this is infeasible to fix, enter "N."
- If the file is too small to require structured text, enter "n/a."
4.2.2.1.3 Color Contrast
- The Accessibility Checker has a space for contrast. Word often suggests a color and you can change it there directly.
- If this is complete, in the Materials spreadsheet, in the Contrast column, enter "Y." If not, and if the issue is infeasible for us to fix, enter "N."
Note: Slide Reading Order does not apply to Word. In the Materials spreadsheet, in the Slide Reading Order column, enter "n/a."
4.2.2.2 PowerPoint Documents
- Open the PowerPoint file.
- In the ribbon, click "File," "Info," "Check for Issues," "Check Accessibility."
4.2.2.2.1 Alternative Text
- The Accessibility Checker has a space for alternative text.
- If there are only a small number of images without alternative text which can be described outside of subject matter expertise, ALG can fix this.
- If there are a large number of images without alternative text, or if the image descriptions require subject matter expertise, ALG will need to contact the author.
- There is a way to mark alt text images as decorative using a checkbox. This is applicable only if the images do not convey any information to the reader.
- If this is complete in at least two of the files in a row (unless there is only one file), in the Materials spreadsheet, enter "Y." If not, enter "N." If there are no images, enter “n/a.”
4.2.2.2.2 Color Contrast
- The Accessibility Checker has a space for contrast.
- If all contrast is compliant, in the Materials spreadsheet, in the Contrast column, enter "Y." If not, enter "N." If there isn't a need for color contrast (like in a small black-and-white document), enter "n/a."
4.2.2.2.3 Slide Reading Order
- The Accessibility Checker has a space for document structure.
- If a small issue like a missing title is easy to fix, we can fix it. If every slide is missing structure, check one more file. If that file is also missing structure, assume all slides lack structure; this will need to be fixed by the project team.
- In the Materials spreadsheet, in the Slide Structure column, if compliant, enter "Y." If not, enter "N."
- To double-check the reading order, in the ribbon, click "View," then "Outline." Everything a screen reader should need aside from the image alt text should be present, including titles.
4.2.2.3 PDF Documents
- Open the PDF file.
- Be sure that the "All Tools" option is selected on the menu ribbon.
- On the left side bar, click "View More," "Prepare for Accessibility," "Check for Accessibility," "Start Checking."
- On the right side, a report will appear.
- Every time you want to open this accessibility checker, you will have to go through the process above again and re-analyze the file. When you enter another mode, such as Edit PDF or Bookmarks, you will have to re-open the checker.
4.2.2.3.1 Alternate Text
- In the "Alternate Text" tab, check the following:
- Figures Alternate Text: This is an immediate issue that must be addressed by the author unless:
- There are very few images and they're easily described by ALG Staff, such as "Photograph of a pine tree."
- There are captions under the images which provide sufficient information. Adding alternative text to descriptive captions will be interpreted by a screen reader as a duplicate description.
- In the Materials spreadsheet, if alternate text is complete or there are descriptive captions for each image, enter "Y." If not, enter "N." If there are no images, enter "n/a."
4.2.2.3.2 Reading Order and Tags
- In the "Document" tab, check and fix the following:
- Image-only PDF: This is an immediate issue. Contact the Project Lead to get a different version of the file, possibly a Word document that they have in their files.
- Tagged PDF: If failed, right-click and click "Fix." This should auto-tag the PDF. Check the tags to ensure these are accurate.
- Do any of the new tags apply to the chapters of the book? If so:
- Go to the bookmarks.
- Click the three dots, then "New bookmarks from structure."
- Select the kind of tag that applies to the chapters of the book. Bookmarks are now ready; check them for accuracy.
- If both of these are complete, in the Materials Spreadsheet, in the Structured Text or PDF Tags column, enter "Y." If not, enter "N." If the PDF is too small to need structured text, enter "n/a."
4.2.2.3.3 Color Contrast
- Contrast in a PDF usually requires a manual check. The Accessibility Checker will normally indicate this.
- Manual contrast assessment, due to a lack of automatic checking, will be subjective, but it should meet a luminance ratio of 4.5:1 between the background and text.
- In the Materials spreadsheet, if contrast is complete, enter "Y." If not, enter "N." If not applicable, enter "n/a."
4.2.2.3.4 Case-by-Case Issues
- Language: If failed, right-click and click "Fix." Select the appropriate language. This is a small technical issue that we don't have a space for in the Excel sheet because it's a very easy fix.
- Tables: Check the following:
- Headers: If failed, this is hard to fix. It's easier to reach out to the author and ask for a Word document in order to fix it from there.
- In the Materials spreadsheet, table accessibility issues aren't common enough for a column, but enter the issue in the Notes column.
4.2.2.4 Excel Documents
- Open the Excel file.
- In the ribbon, click "File," "Info," "Check for Issues," "Check Accessibility."
4.2.2.4.1 Color Contrast
- The Accessibility Checker has a space for contrast.
- If this is compliant, in the Materials spreadsheet, in the Contrast column, enter "Y." If not, enter "N." If there isn't a need for color contrast (like in a small black-and-white document), enter "n/a."
4.2.2.4.2 Case-by-case issues
- Excel files with images are exceedingly rare, but alt text should be provided if images are included.
- All tables should have a header row. If the sheet has all data in tabular format, this should be applied by default.
- For anything that doesn't apply in the accessibility columns, enter "n/a."
- For any special accessibility situations pertaining to Excel, enter these in the notes column.
4.2.2.5 Other File Types
The accessibility of other file types received and processed by Affordable Learning Georgia will be evaluated on a case-by-case basis. Some types of files cannot be realistically evaluated or remediated by ALG Staff due to capacity or field specialization.
4.2.2.5.1 Audio Files
- Transcripts of audio speech must be made available by the author.
4.2.2.5.2 Video Files
- If the videos are already provided as streaming, captions must be made available by the author.
- If the videos are files sent to us, transcripts must be made available by the author in order to generate accurate captions.
4.2.2.5.3 Individual Image Files
- Individual image files are usually, but not always, instructor-facing files to assist with LMS course design. Accessibility work would take place within the system. These do not normally need to be checked unless they are used as sources of information and would require alt text in the platform where they are used.
4.2.2.5.4 Code
- Raw code processed by software: Raw code files intended to be read by a software platform, such as a Common Cartridge file or code files in Python or R, cannot be assessed for accessibility by ALG staff due to capacity.
- Raw code as educational text: Code could alternatively be used as readable text within a course. In this case, code must follow WCAG 2.1 AA standards within that document’s capabilities.
4.2.2.6 Third-Party Resources
- Resources created by grantees but hosted outside of ALG cannot be remediated by ALG staff, and the ability for ALG staff to check accessibility on third-party resources can vary.
- For all third-party resources:
- In the "Accessibility Status" column, enter "Third-Party."
- For our checks on the four accessibility columns (Alt Text or Captions; Structured Text or PDF Tags; Contrast; Slide Structure), enter "n/a."
4.2.3 Accessibility Status and Follow-Up
4.2.3.1 Assigning Accessibility Status
- In the “Accessibility Status” column, ensure all reviewed resources have been correctly marked as “Compliant,” “Non-Compliant,” “Third-Party,” or “Report.”
- Ensure any notes on the accessibility issues, which will be used to email project teams, are included in the “Notes” section.
- If ALG has fixed the resource, describe that fix in the “Action” column.
- At the end of the checks, if a resource is marked “Compliant,” “Third-Party,” or “Report,” proceed to the Materials Publication process.
4.2.3.2 Follow-Up With Project Team (if Non-Compliant)
Do not publish any resources marked as “Non-Compliant.” Instead, contact the project team. In an email to the Project Lead:
- Inform the lead that the resources are currently inaccessible due to (describe the issues).
- Use the Notes to clarify anything specific to the project.
- For example, if there are multiple file types, indicate where the issues are. If there is automatically-generated alternative text where only the image’s source URL is listed (common with Google Images), let them know what has to change about the alternative text.
- Ensure the team knows that we cannot publish their materials until they are made accessible.
- Provide a way to submit the resources, either through email or through a cloud storage location for larger files.
- When the resources are checked and the requirements are met, change the status appropriately and proceed to the Publishing Materials process.
5 Major Process: Sharing Open Materials
5.1 About: Understanding Metadata as a Concept
In the major process of publishing ALG materials, the metadata creation procedures are possibly the most important part of the process performed by ALG staff. Metadata work requires description and judgment calls. Taking place after the arrival of Final Reports for a grant round, metadata entry is one of the more time-intensive and energy-intensive periods of intermittent work for staff. Due to the major demand on staff time and the differences in metadata knowledge between librarians and non-librarians, this section starts with a narrative description of metadata and its significance in open education before moving to the actual procedures for metadata entry.
5.1.1 Metadata for Non-Librarians
Metadata theory resides largely within the cataloging subdomain of library and information science, and not every ALG staff member has been a librarian trained in metadata theory and practice. Instructional designers, for example, are not assumed by default to have metadata training at the level provided by a library and information science degree.
Many metadata guides are intended for either library catalogers or web developers interested in search engine optimization (SEO): it can be very easy to get lost in software or business-specific details not needed within ALG. For non-librarian staff, a good primer for understanding metadata is the Indiana University LibGuide on understanding metadata. This guide uniquely explains metadata outside of the context of catalogs and various library standards and fits what an ALG Program Manager would need to know about metadata.
5.1.2 Metadata and OER
When a grant team creates an open textbook and ALG shares it with the world, we are sharing instructional data with the world. This data, if ALG shared it in a random location on the Web without any structure or description, could not be found by anyone looking for an open textbook in their subject area or for their course. Without a way for instructors, librarians, and instructional designers to find OER, sharing anything is meaningless; the objectives behind open education require resources to be shared and discovered.
Even a perfect search engine, which does not exist, would be inefficient at finding a random textbook file in a random location, even with both algorithmic and language-model processing and ample time and energy. Such an engine would have to find a completely unstructured, undescribed document in a search across all existing files on the Web, read the entire text, categorize it in the right subject, interpret which topics within the subject it covers, and then, hopefully, assign the possible courses that cover those topics. Ideally, the engine would do all of this automatically in a full survey of the open Web. At the time of this publication, even in a best-case scenario, this would be an inefficient and ineffective process. Human experts need to assist with the discovery process, making it easier for everyone involved.
Metadata makes this discovery process, for both the searcher and the engine, far easier and more accurate. Metadata, at the highest level, is defined as “data about data.” In the case of an OER repository, the resources themselves (textbooks, lecture slides, etc.) are the data. OER metadata is everything that’s used to help people find OER, from the title of the resource to the authors to the subject areas. Metadata is the crucial structural intervention between the searcher and their results. Better metadata means better discoverability, and with a vast, open world of OER without one central home for all resources, discoverability is crucial.
5.1.3 Controlled Vocabulary
If you browse an open educational resources repository and see a subject area for “Biology” alongside a subject area for “Biological Sciences,” and these two areas contain different resources from each other, you have a controlled vocabulary issue. Because metadata is used primarily for discovery purposes, a standard structural aspect of any metadata schema is a controlled vocabulary that allows for accurate discovery.
Libraries need to make all of their resources discoverable, from the paper card catalogs to the digital indexes used today. Depending on the library’s type (academic, public, etc.) and their location (United States, Europe, etc.), librarians use controlled vocabularies within cataloging systems like the Library of Congress and Dewey Decimal systems. These are not created by companies or one institution, but by a large group of experts, usually coordinated by nonprofits and government entities. Creating and sustaining a cataloging system is a gigantic national and/or international effort: a cataloging system takes years of organizing, community-building, consensus-building, writing, revising, and updating. As the world changes, the cataloging system needs to change as well, and any changes to the system affect every library using that system.
Open education, due to its early stage of existence and the capacity of its experts, does not yet have the means to create a unified cataloging system or a set of controlled vocabularies, though they may if the field formalizes and expands. This means that open education organizations like ALG have had to create their own vocabularies for over a decade of program management. These vocabularies are then collected by various library-specific indexes and open online indexes (OASIS, MERLOT) as part of their data harvesting and collection processes. Differences in controlled vocabularies among OER repositories make creating and sustaining a centralized hub for all OER with reasonably usable and accurate discovery difficult. Further, if the lists change in their original locations, those lists need to change at the collector’s location as well. Changes in controlled vocabulary lists could lead to compatibility and interoperability issues down the line and need to be handled with care.
5.1.3.1 Controlled Vocabulary at ALG
In 2016, when ALG first implemented a repository, the staff created controlled vocabulary lists based on USG controlled lists: subjects pertain to the USG list of academic disciplines, which governs the creation of academic discipline advisory committees. This list comes from an authority within the organization and should apply readily to the institutions within it. However, it is not a perfect source for controlled vocabulary: committees may change their names over time, while our controlled vocabulary needs to stay the same.
ALG had to create a list of materials types (such as open textbook, open course) at the start of the repository. This list came from simple brainstorming procedures. Without any type of a controlled vocabulary to reference, ALG had to hope that this list would continue to be relevant in the future and not have to change. Not only did the materials type point to the type of resource, but it also determined where a resource would be placed.
This makes our 2016 controlled vocabularies somewhat outdated by 2026 standards. At some point, this list may need to change, but as mentioned above, this is no small task: this change will need to be made across the repositories and across all entries for resources. The benefits of a change need to outweigh the amount of time required to manage it.
Some of the materials types in ALG Commons, as a result, are rarely used, while others (such as “Homework”) may seem a bit too informal for what’s placed there (essay assignments, projects). This is a result of a small team building a controlled vocabulary with no existing reference at the time. These lists can, and should, be changed in the future, and that effort will likely take time to complete and implement, with consideration given to external indexes.
5.2 Process: Understanding ALG Metadata
5.2.1 Metadata Field Definitions
The columns in the Materials spreadsheet are the dimensions of each set of materials, which are organized as one fact, or row, per set. Most grant projects will have one set of materials, but a few will have two or more. Columns of the Materials spreadsheet map to specific metadata fields in our repositories. Processing Final Reports procedures will include populating many of the Excel fields in the Materials spreadsheet.
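The row-per-set structure can be pictured as a simple record. In this hypothetical sketch, the field names mirror the column definitions that follow, and every value is invented for illustration:

```python
# One hypothetical row of the Materials spreadsheet: one set of materials.
# Field names mirror the column definitions below; all values are invented.
materials_row = {
    "Grant #": "M241",
    "Manifold Materials Type": "Textbook",
    "Commons Location": "textbook",           # derived from the materials type
    "Manifold Subject": "Biological Sciences",
    "Commons Subject": "biology",             # derived from the Manifold Subject
    "Title": "Introduction to Biology Open Textbook",
    "License": "CC BY 4.0",
}

# The Grant # acts as the row index, so it must be unique per row.
print(materials_row["Grant #"])          # M241
print(materials_row["Commons Location"])  # textbook
```

Each column definition below corresponds to one key in a record like this, and some columns (Commons Location, Commons Subject) are derived from others by formula.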
5.2.1.1 Grant (#)
Grant number. This is the leftmost column of the spreadsheet.
5.2.1.2 Manifold Materials Type
Textbook, Ancillary, Adoption, or Research. “n/a” or blank is used if materials are arriving late.
- Textbook: Comprehensive resource for the course divided into chapters and/or units.
- Ancillary: Materials supporting teaching the course, either partially covering the course content or supplementing an open text.
- Adoption: Only grant report materials, as this is a straight adoption of existing OER.
- Research: Research Report, the deliverable for a Research Grant.
5.2.1.3 Commons Location
Materials type formatted for the ALG Commons list of locations.
- textbook = Textbook
- ancillary = Ancillary
- collections = Adoption
- reports = Research
5.2.1.4 Manifold Subject
Subject areas as listed in the names of USG Academic Committees at the point of creating this list. Subjects are:
- Fine Arts
- Biological Sciences
- Business Administration, Management, and Economics
- Chemistry
- Communication
- Computer Science and Information Technology
- Criminal Justice and Law
- Education
- Engineering
- English
- First-Year Experience
- Geological Sciences and Geography
- Nursing and Health Sciences
- History
- Information Literacy
- Foreign Languages
- Mathematics
- Philosophy and Religion
- Physics and Astronomy
- Political Science
- Psychology
- Research Reports
5.2.1.5 Commons Subject
Subjects formatted for the ALG Commons lists of locations.
- arts = Fine Arts
- biology = Biological Sciences
- business = Business Administration, Management, and Economics
- chemistry = Chemistry
- communication = Communication
- compsci = Computer Science and Information Technology
- criminal = Criminal Justice and Law
- education = Education
- engineering = Engineering
- english = English
- fye = First-Year Experience
- geo = Geological Sciences and Geography
- health = Nursing and Health Sciences
- history = History
- infolit = Information Literacy
- languages = Foreign Languages
- mathematics = Mathematics
- philosophy = Philosophy and Religion
- physics = Physics and Astronomy
- polisci = Political Science
- psychology = Psychology
- research = Research Reports
5.2.1.6 Commons Material Type
A material type selected from a drop-down menu in ALG Commons. These are:
- Open Textbook: a comprehensive text for a course or a lab.
- Open Course: a set of multiple ancillary materials implemented within a course.
- Grants Collection: a set of grant documents from a project where no OER was created. These are now known as “Adoptions”; “Grants Collection” is the older term.
- Lecture Slides: a set of lecture slides in PowerPoint, LibreOffice, Keynote, or other formats.
- Homework: Assignments, practice questions, and at-home activities.
- Assessment: Tests or quizzes.
- Video: Audiovisual-only media.
- Audio: Audio-only media.
- Photograph/Image: Image-only media.
- Research Report: a report from a Research Grant.
Rarely-used options:
- Marketing Materials: Materials created for promoting OER within a particular project or on a particular campus.
- Article: Research article.
5.2.1.7 Commons Link
The link to the new materials in ALG Commons.
5.2.1.8 Institution
Institution name in full.
5.2.1.9 Project Lead
Project Lead for the grant project.
5.2.1.10 Lead Email
Email address for the Project Lead.
5.2.1.11 Team (#)
Team member other than the Project Lead, listed as Team 1, Team 2, etc.
5.2.1.12 Email (#)
Email address for the team member with the same number.
5.2.1.13 Materials Link
If the materials are hosted on the Web and not sent to us in a file, put the URL here.
5.2.1.14 Description
A description of the materials. This will be the main description for the entry in both repositories.
5.2.1.15 Title
The title of the materials. This will be the main title for the entry in both repositories.
5.2.1.16 Manifold Link
URL to the entries after they are added in Manifold (OpenALG). These links are critical when creating the corresponding ALG Commons entry.
5.2.1.17 Course Title
Title of the course being addressed by the project and materials.
5.2.1.18 License
Creative Commons license type, assuming 4.0 unless otherwise indicated.
5.2.1.19 License Embed
Embed HTML used for each license. This is critical for OpenALG, as Manifold does not have a CC license drop-down and needs HTML instead.
- Embed codes are from the Creative Commons License Chooser.
- It's easier to complete the license Type column first, then get the embed HTML for one type of license, then copy/paste the HTML for all resources with that type of license.
5.2.1.20 Course Number
A standard identifier for the course, typically eight characters (e.g. ENGL 1102). Some course numbers stay the same across institutions, others vary.
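A quick consistency check on course numbers can catch stray formatting before metadata entry. The pattern below is a hedged sketch of the common shape described above (a letter prefix, a space, four digits, and an optional trailing letter for labs such as BIOL 1101L); real course numbers may vary, so treat non-matches as prompts for review, not errors.

```python
import re

# Hypothetical pattern for the common course number shape: a 2-4 letter
# prefix, a space, four digits, and an optional trailing letter (labs).
COURSE_NUMBER = re.compile(r"^[A-Z]{2,4} \d{4}[A-Z]?$")

print(bool(COURSE_NUMBER.match("ENGL 1102")))   # True
print(bool(COURSE_NUMBER.match("BIOL 1101L")))  # True
print(bool(COURSE_NUMBER.match("engl1102")))    # False: needs review
```

A non-matching value is often just a lowercase entry or a missing space that can be normalized by hand.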
5.2.1.21 Notes
Anything helpful to know beyond these columns about the materials. For example, if a continuous improvement grant's materials are a revision of materials that are already in the repository, put that here as a reminder that we don't need to create a new entry in the repository.
5.3 Process: Completing Metadata Fields in Excel
Once these columns are complete, this spreadsheet serves as the data structure that allows a smoother process for metadata entry in the two separate locations for it (Manifold/OpenALG, Digital Commons/ALG Commons).
Ignore the following column sections, which come after “Grant #” in the spreadsheet, until a later point in the process:
- Administrative Columns: “Received” through “PayReq”
- Accessibility Check: “Accessibility” through “Action”
- Grant Information: “PO” through “Email (last number)”
Fill in a project’s row for all columns not covered by formulas:
5.3.1 Grant (#)
While this is already populated by grant project, the Grant # is still important here, as it functions as the Row Index, meaning there should be no duplicates in this column.
When grant projects cover more than one course with more than one set of materials, we will need to copy the original grant row and make new rows with unique numbers. For these new rows, add a number to the end of the grant number to identify it as a unique set of materials:
- Check to see if the materials are for more than one course and are not one cohesive resource. For example, an Anatomy and Physiology I and II textbook covers both I and II. Sets of lecture slides for Introduction to Engineering and Engineering Statics are two different sets of materials for two separate courses.
- If there are materials for more than one course, you will have more than one entry in the repository. Create a row for each entry, and for the grant number naming:
- Use the format (Grant #)-1, (Grant #)-2, etc. (e.g. 774-1, 774-2, M241-1, M241-2).
- From there, treat each entry as a separate set of materials for those particular courses. Change the Grant Information (PO through Email (last number)) as appropriate before metadata entry.
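The sub-numbering convention above can be sketched as a small helper. This is an illustration only, with hypothetical inputs; the actual work happens by copying rows in Excel.

```python
def split_grant_rows(grant_number, material_sets):
    """Generate one unique row identifier per set of materials,
    following the (Grant #)-1, (Grant #)-2, ... convention."""
    return [f"{grant_number}-{i}" for i in range(1, len(material_sets) + 1)]

# A hypothetical grant with two separate sets of materials
# for two separate courses:
sets = ["Introduction to Engineering slides", "Engineering Statics slides"]
print(split_grant_rows("774", sets))   # ['774-1', '774-2']
print(split_grant_rows("M241", sets))  # ['M241-1', 'M241-2']
```

Because the Grant # column functions as the row index, each generated identifier must remain unique across the sheet.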
Skip to the metadata column section (“Manifold Materials Type” through “Notes”) for the rest of this process.
5.3.2 Manifold Materials Type
A materials type selected from a drop-down menu for use in OpenALG.
- Textbook: Comprehensive resource for the course divided into chapters and/or units.
- Ancillary: Materials supporting teaching the course, either partially covering the course content or supplementing an open text.
- Adoption: Only grant report materials, as this is a straight adoption of existing OER.
- Research: Research Report, the deliverable for a Research Grant.
5.3.3 Commons Location
This is an if-then formula which takes the Materials Type and assigns it to a location in ALG Commons:
=IF([@[Materials Type]]="Textbook", "textbook", IF([@[Materials Type]]="Ancillary", "ancillary", IF([@[Materials Type]]="Adoption", "collections", IF([@[Materials Type]]="Research", "reports", ""))))
5.3.4 Commons Materials Type
A material type selected from a drop-down menu for use in ALG Commons. These are:
- Open Textbook: a comprehensive text for a course or a lab.
- Open Course: a set of multiple ancillary materials implemented within a course.
- Grants Collection: a set of grant documents from a project where no OER was created. These are now known as “Adoptions”; “Grants Collection” is the older term.
- Lecture Slides: a set of lecture slides in PowerPoint, LibreOffice, Keynote, or other formats.
- Homework: Assignments, practice questions, and at-home activities.
- Assessment: Tests or quizzes.
- Video: Audiovisual-only media.
- Audio: Audio-only media.
- Photograph/Image: Image-only media.
- Research Report: a report from a Research Grant.
Rarely-used options:
- Marketing Materials: Materials created for promoting OER within a particular project or on a particular campus.
- Article: Research article.
5.3.5 Manifold Subject
This is a drop-down where the Manifold Subject, as listed in the Definitions section, is entered.
5.3.6 Commons Subject
This is an if-then formula which takes the Manifold Subject and assigns it to a subject area in ALG Commons.
Note on the formula: This column was once a drop-down box, in the interest of avoiding a long formula, but the formula makes the Excel sheet itself more usable. Because the formula is long, here is a description of what it does:
“If the Manifold Subject is ‘(the first Manifold Subject from the Data Validation tab’s list),’ enter ‘(the corresponding Commons Subject from the Data Validation tab’s list).’ If not, move on to the next Manifold Subject on the list. Go down the list until you find a match and enter the corresponding Commons Subject. If none of the subjects on the list apply, leave the cell blank.”
=SWITCH([@[Manifold Subject]],"Fine Arts","arts","Biological Sciences","biology","Business Administration, Management, and Economics","business","Chemistry","chemistry","Communication","communication","Computer Science and Information Technology","compsci","Criminal Justice and Law","criminal","Education","education","Engineering","engineering","English","english","First-Year Experience","fye","Geological Sciences and Geography","geo","Nursing and Health Sciences","health","History","history","Information Literacy","infolit","Foreign Languages","languages","Mathematics","mathematics","Philosophy and Religion","philosophy","Physics and Astronomy","physics","Political Science","polisci","Psychology","psychology","Research Reports","research","")
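The same subject mapping can be checked outside Excel. The sketch below copies the vocabulary pairs from the SWITCH formula above into a Python dictionary; `dict.get` with a default of an empty string mirrors the "leave it blank" behavior for unmatched subjects.

```python
# Manifold Subject -> Commons Subject, copied from the SWITCH formula above.
COMMONS_SUBJECTS = {
    "Fine Arts": "arts",
    "Biological Sciences": "biology",
    "Business Administration, Management, and Economics": "business",
    "Chemistry": "chemistry",
    "Communication": "communication",
    "Computer Science and Information Technology": "compsci",
    "Criminal Justice and Law": "criminal",
    "Education": "education",
    "Engineering": "engineering",
    "English": "english",
    "First-Year Experience": "fye",
    "Geological Sciences and Geography": "geo",
    "Nursing and Health Sciences": "health",
    "History": "history",
    "Information Literacy": "infolit",
    "Foreign Languages": "languages",
    "Mathematics": "mathematics",
    "Philosophy and Religion": "philosophy",
    "Physics and Astronomy": "physics",
    "Political Science": "polisci",
    "Psychology": "psychology",
    "Research Reports": "research",
}

def commons_subject(manifold_subject):
    """Return the Commons Subject, or an empty string (blank) if unmatched."""
    return COMMONS_SUBJECTS.get(manifold_subject, "")

print(commons_subject("Physics and Astronomy"))  # physics
print(repr(commons_subject("Not a Subject")))    # ''
```

Keeping the mapping in one place like this makes it easy to verify that the Excel formula and the Data Validation tab's list stay in sync whenever the controlled vocabulary changes.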
5.3.7 Materials Link
If the materials are linked and not hosted locally, enter the URL here. If not, enter Local.
5.3.8 Title
- If the Type is an adoption, name it (Course Title) Adoption.
- If the Type is ancillary materials and it's a mix of ancillary types, name it (Course Title) Ancillary Materials unless the team gave us a particular title for the ancillaries.
- If the Type is ancillary materials and it's one type of ancillary, name it (Course Title) (Type of ancillary), e.g. Introduction to Biology Video Set unless the team gave us a particular title for the ancillaries.
- If the Type is a textbook, name it (Course Title) Open Textbook unless the team gave us a particular title. We will include any ancillaries within this entry.
5.3.9 Description
- If the team gives ALG an official description under the Responses tab, use this unless it doesn’t appropriately describe the materials. We are permitted to correct typographical and grammatical errors.
- If there is no official description and this is a textbook, put the following:
- This (Open Textbook / Set of Ancillary Materials / Other Type) for (Course Title) was created under an Affordable Materials Grant. Topics covered include (either list all chapters/topics if it's a short (6-7 items) list, or pick a selection of 6-7 chapters/topics).
5.3.10 License
- While ALG copies this from the Responses sheet, some teams will not understand what goes here. If this is not answered, check the materials for any license designations.
- We mark CC BY 4.0 International by default if no license is present, but it is still good to check with the team just in case.
- Ensure that no open licenses use the 2.0 (much older) versions of the license. These versions had a legal issue due to an automatic license termination feature.
5.3.11 License Embed
Do this after the License column is complete for all materials – this makes a more bulk-oriented procedure easier.
- Before filling this in, sort the rows by the License column.
- Copy the embed code from the Creative Commons License Chooser.
- Select "I know the license I need." for 1, then select the license for 2. The embed code is on the right-hand side of the page under "Mark Your Work." Select the HTML tab and click Copy.
- Paste the embed code in the License Embed cells for projects with that license.
- Once finished, change the sort back to Grant #.
5.3.12 Manifold Link
Enter the URL for the Manifold entry once created using the procedures in 3.3.4 “Creating OpenALG Entries.”
5.3.13 Commons Link
Enter the URL for the ALG Commons entry once created using the procedures in 3.3.5 “Creating ALG Commons Entries.”
5.3.14 Commons Material Type
This is a dropdown using Data Validation. Some of these material types are a bit outdated - pick the best fit for the materials you have.
- Leave the Manifold and Commons links columns blank.
- Leave the Notes column blank unless you have particular notes to share within ALG about that project.
5.3.15 Notes
Enter any helpful notes here. For example, if a continuous improvement grant's materials are a revision of materials that are already in the repository, put that here as a reminder that we don't need to create a new entry in the repository.
5.4 Process: Publishing Materials
Note: Do not create new entries and upload files for sets of files which were found Non-Compliant in the Accessibility Check. Reach out to the team members for fixes and proceed when the materials comply with the accessibility criteria.
- Login to OpenALG with your account and click "Enter Admin Mode."
- Click "Add a New Project."
- Enter the Title.
- Enter the Subtitle, which is the institution's name.
- Enter the description under "Brief Description."
- The multiple-choice questions which follow Brief Description are not important unless one grant project has multiple texts.
- Click "Save and Continue."
- On the sidebar, click "Properties."
- Enter the publication date as the 1st of the month of the grant deadline. For Spring 2023, enter May 1st.
- Check the slug, which will be the last part of the project's URL.
- If it's too long, edit the slug to be short and descriptive.
- When in doubt, include the course name and institution. If the course name is too long, use the course number.
- Do not edit this after you have published and announced that this project exists; otherwise, you'll wind up with error messages from sites linking to this work.
- Upload the appropriate thumbnail from ALG / Images and Marketing / ALG Logos / OpenALG: textbook-square, ancillary-square, or adoption-square.
- Search for the subject area and add it from the drop-down list.
- Add a tag for:
- The course title, like "English Composition I"
- The course number, like "ENGL 1102."
- Avoid letters after the four numbers if possible - BIOL 1101L is a lab that corresponds to BIOL 1101, and grouping all of these together is helpful.
- Normally, you would wait until the end of the entry to turn Draft Mode off, but for efficiency's sake, it's probably fine to turn it off now and not have to return to the page.
- Click "Save Project."
At this point, if you have a Manifold Text to be ingested into Manifold and made into a web-readable text, that process is handled separately on a case-by-case basis. If you do not have a Manifold Text, proceed to the next step; if you do, complete that process first and then move on.
- On the sidebar, click "Layout."
- Click the "Calls to Action" box.
- For the main materials, add a Left-Side Button.
- Add the file if this is a download, or add the link if this is a link. Label the link according to the action: "View Open Course," "Download Lecture Slides."
- Click "Save."
- If there are more main materials (for example, a lecture slide set and a homework set), add more than one left button.
- Add other supplementary files as Left-Side Links. These are smaller and will appear below the buttons.
- Add a Right-Side Link.
- Add the ZIP folder with the grant number as the title here as a download.
- Call the link "Grant Docs."
- Click "Save."
- Click "Description and Images" in the top box.
- Add the cover image from ALG / Images and Marketing / ALG Logos / OpenALG: textbook, ancillary, or adoption.
- Click "Save."
- On the sidebar, click "People."
- Add the names of all team members. Select the role "Author" for each.
- If someone specifies that they are not an author, select the role "Contributor" for these.
- If this is a new name, an error message will display with a link to create a new Maker. Click that link and create the new maker, then return to the previous page.
- Alternatively, in the top navigation bar under Records, select “Makers” and add all makers by first and last name to Manifold first.
- In the case that a grantee wants to be added as an editor to the project to edit it in the future directly in Manifold, the "Access" section is how we do that.
Note: Manifold texts and rarely-occurring collections of file-based photos or videos can be added using Resources and Resource Collections. These are treated on a case-by-case basis using the ingestion tool and edits in both Word and Markdown. If you do not need the Resources feature, proceed to the next step. If you do, complete the Manifold text or resource processes and then move on.
- On the left sidebar, click "Metadata."
- Copy and paste the CC License Embed under "Rights." This will allow it to display on the project's front page.
- Check the entry using "View" in the top box.
- Copy the new URL and in the Materials file, enter it in the Manifold Link column.
5.4.1 Creating ALG Commons Entries
Note: Do not create new entries and upload files for sets of files which were found Non-Compliant in the Accessibility Check. Reach out to the team members for fixes and proceed when the materials comply with the accessibility criteria.
- Log into ALG Commons using the "My Account" link at the top of the page.
- Click the "oer.galileo.usg.edu" link to enter admin mode.
- Select the ALG Commons Location as listed on the Excel spreadsheet.
- Click "Upload Book" or "Upload Submission" on the top bar. This varies by whether or not the collection is a book collection (Textbooks, Adoptions) or another materials collection (Ancillary, Research).
- Enter the metadata as listed in the Excel spreadsheet:
- Author: The author listings include email in ALG Commons. Email addresses are important as they'll inform the team members that their materials are now up and running.
- Author listings include an auto-complete of sorts for the institution field. This field isn't up to date for every institution.
- Publication Date: Use Season and Year for the semester Publication Date.
- Keywords: Keywords are basic descriptive terms, such as "statistics" or "theatre appreciation." Enter only one or two.
- Disciplines: Select the most appropriate discipline from the Disciplines list. Calculus doesn't have one, so Mathematics is fine. Computer Sciences are in Physical Sciences and Mathematics. Psychology and Sociology are in Social and Behavioral Sciences. This is just for a supplementary discovery tool; use your best judgment.
- CC License: Select the CC license from a drop-down menu; no need to use embed codes.
- Link: Select "Link out to file on remote site" to get to the Manifold instance - this is where we're now keeping files themselves.
- No need to add additional files; do not check the additional files box.
- If there's a space for a cover image (for textbooks and adoptions), use this folder path: "ALG / Images and Marketing / ALG Logos / OpenALG / covertextbooks or coveradoptions."
- After that, click "Submit." Be sure to click the publish button near the bottom of this page; otherwise, the resource will sit on the server unpublished.
6 Major Process: Data Management
6.1 Data Management Context
Specific context of Affordable Learning Georgia's position and funding is required to understand why ALG performs its regular data processes; otherwise, a reader may interpret this section as an arbitrary set of complicated steps, or counting for counting's sake. All hours spent on data work are crucial to Affordable Learning Georgia's existence, sustainability, and awareness throughout the USG.
6.1.1 2014-2016: One-Year Funding and Reporting in the First Two Years
Affordable Learning Georgia is currently a program supported by state funds within the annual USG budget through GALILEO. This was not always how ALG funding was structured. After 2013's unfunded pilot year of development work between GALILEO, librarians, and instructional designers, the plans and structures behind ALG were ready for a funding proposal to the State of Georgia through the USG. In 2014, GALILEO received one-year program funding from the Georgia legislature, through the USG, to establish and run ALG. State program funding is a specific type of funding that comes with a mission: the ALG mission was to reduce the cost of textbooks for students and enhance discovery within GALILEO. That mission required accountability, and accountability required reporting. The GALILEO side of this program funding was used to enhance the discovery of no-cost library electronic resources through the establishment of a new unified Integrated Library System (ILS). Reporting on that project's progress was necessary and completed by GALILEO.
6.1.1.1 2014-2015: Year One Report
ALG had a unique data issue: in one year, the program needed to see open, no-cost, and low-cost implementation at its institutions, while the USG operated within an academic freedom framework surrounding how faculty taught and what materials they used. ALG would need to catalyze and support what faculty decided to use. Transformation Grants were ALG's answer, connecting system-level support and faculty-driven peer review with faculty-level decision-making. Awareness first needed to be raised, then funding provided for projects with one semester of implementation work (Fall) and one semester of teaching with the materials (Spring), with a final report due at the end of that spring semester. In a way, the ALG grant model, which became a model for emerging OER programs throughout the United States, took form strictly within its own one-year funding constraints.
Further, the mission to reduce the cost of textbooks for students brought with it a unique data accountability need: exactly how much would an implementation of materials save students? These projects ended after the point at which funding had to be requested for the next fiscal year, so ALG would be unable to confirm that semester's actual savings in its report. Instead, ALG asked for the average number of students affected in one year, the cost per student of materials being replaced, and any (low) costs per student associated with the new materials. ALG reported the potential one-year impact of student savings in its first report.
6.1.1.2 2015-2016: Year Two Report
Funding during the second year of the program consisted of legislative one-year program funding once again. ALG continued and expanded its grant program under the constraints of one additional year of funding, building on the success of the first round of grants. For legislative reporting, the system office determined that an update to the first year's report would not suffice for the second year, as the first year's grantees were one year past implementation, theoretically continuing student savings for three more semesters.
ALG would have to provide not only annual savings numbers, but savings by each common USG semester - summer, fall, and spring, in that order, as one fiscal year. ALG did not have this data from its grantees; it had only the annual estimates collected to meet the needs of the program's first year. While ALG asked grantees for this data in the second year, there was no requirement to respond to this question in the original grant agreements. In the interest of transparency and simplicity, ALG separated any non-responding grant teams' original annual estimates into even thirds per summer, fall, and spring. While this was not a perfect estimate, as summer semesters were quite different from fall and spring, it was an estimate that could be logically explained given the circumstances as a best guess.
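The even-thirds fallback can be sketched in a few lines. This is an illustrative Python sketch, not ALG's actual spreadsheet logic; the function name and rounding behavior are assumptions.

```python
def split_annual_estimate(annual_savings: float) -> dict:
    """Split a non-responding grant team's original annual savings
    estimate into even thirds across summer, fall, and spring,
    per the fallback described above."""
    third = round(annual_savings / 3, 2)
    return {"Summer": third, "Fall": third, "Spring": third}
```

Note that rounded thirds may not sum exactly to the original annual estimate when it is not evenly divisible by three; how to reconcile that remainder is left to whatever reporting requires.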
The data was also not easy to normalize alongside the OER efforts of eCore, a systemwide online core curriculum program that implemented OER in partnership with ALG and provided its own data, based on exact eCore enrollment counts, once per year for a final report. These numbers were kept separate in ALG reporting.
6.1.2 2017-2018: Improving Data Procedures in Year Three
With ALG being funded through GALILEO as a regular part of the budget in year three onward, ALG could start looking at longer-term plans, including an improvement of the data collected through the grant program. ALG reached out to the USG Fiscal Affairs division and worked with the director of the Budget Office to employ more conservative and rigorous data collection and processing. The result was a series of improvements:
- Collecting semester-by-semester estimates for the number of students directly affected by a Transformation Grant project from all grant teams.
- A yearly survey to check if prior grantees were still using open, no-cost, or low-cost materials.
- Formulas in our spreadsheets to account for the yearly surveys in savings estimates: if teams reported continuation, grants data was used for savings estimates. If teams reported discontinuation or did not report back, grants data was zeroed out.
The yearly survey, now called the Sustainability Check (on our end) or Sustainability Survey (on respondents' end), became a way for ALG to provide a conservative estimate for student savings per year: non-responding grant project leads could still be using open materials and saving students money, but because it wasn't reported, we did not report their data as savings during that year. We could say to executive leadership that savings were at least as much as the data we provided.
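The conservative counting rule can be expressed compactly. This Python sketch is illustrative only; the status strings and function name are assumptions, not ALG's spreadsheet implementation.

```python
def conservative_savings(status: str, students: int, savings_per_student: float) -> float:
    """Count a grant's savings for the year only if the team explicitly
    reported continuation; discontinued and non-responding projects are
    zeroed out, yielding an 'at least this much' total."""
    if status == "Continued":
        return students * savings_per_student
    return 0.0  # "Discontinued" or no response: zeroed out
```

Summing this function over all prior grants gives the floor figure reported to executive leadership: actual savings may be higher, but never lower than the total.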
6.1.2.1 Challenges Past Year Three
While this worked well for gathering numbers and providing our sources in an annual report of impact in year three of the program, ALG would now be comparing data year-on-year for an analysis of the achievement of measurable objectives. If ALG wanted to improve its outreach and increase annual student savings over the past year, an accurate picture of the sustainability of the previous years' savings would be necessary.
ALG's conservative estimate from their work with the Budget Office was safe to report to the state legislature and executive leadership, but the zeroing-out of survey non-responders would diminish annual student savings numbers year by year. Non-responder counts grew as the pool of previous grant projects grew; no one running a voluntary survey expects a 100% response rate among a large number of potential respondents. In the meantime, project leads may not even receive the survey in the first place: the survey is sent by email to hundreds of project leads, and that email comes from the Program Director - not a position whose email would immediately be recognized as important in one's institutional inbox. Further, project leads may have left for another institution while the department kept open or no-cost materials in place, or the change may have led to the implementation of commercial textbooks, or even the revision of open materials within their own learning management system without any contact with ALG.
These survey challenges persist to this day. ALG cannot enforce a requirement on survey responses after the funding period has ended, nor can it stay abreast of all faculty turnover at 25 USG institutions and determine, from turnover information alone, whether someone took the project lead's place in sustaining the project, if the project continued at all.
6.1.3 2018-2019: Implementing Course Section Markings
Markings in a course schedule for which sections use no-cost and low-cost materials seemed like the future of impact tracking for institution-wide and system-wide data:
- Each course section is required to have its own data entry.
- Course section markings could appear in the official course schedules, informing students of no-cost and low-cost materials options.
- A system with one unified course scheduling platform could export uniform data from each institution, making the no-cost and low-cost materials data from each institution interoperable.
Theoretically, accurate course markings could provide a solution to the Sustainability Survey problem with the third year's data model. ALG would be able to check each grant project against its own course section data and verify either continuation or discontinuation, at least from the perspective of teaching with either no-cost or low-cost materials.
With support from the Chancellor and the GeorgiaBEST team overseeing Banner, the systemwide platform for registration and course schedules, Affordable Learning Georgia was able to establish a set of standards for two course section markings, ZNCM and ZLCM:
- Z: System-wide marking indicator
- NC: No-cost
- LC: Low-cost
- M: Materials
Administrative standards for these markings are located on the ALG Banner Attributes page. Technical standards involve Banner screenshots and technical information that the vendor considers confidential; they are kept behind an authentication barrier, available only to employees working within Banner.
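The markings compose mechanically from the components above, which a short sketch can make explicit. This function is purely illustrative and not part of any ALG tooling.

```python
def expand_marking(code: str) -> str:
    """Expand a systemwide course section marking (ZNCM or ZLCM)
    into its meaning, per the component standard above."""
    cost = {"NC": "No-cost", "LC": "Low-cost"}
    if len(code) == 4 and code[0] == "Z" and code[-1] == "M" and code[1:3] in cost:
        return f"System-wide: {cost[code[1:3]]} Materials"
    raise ValueError(f"Not a recognized marking: {code}")
```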
6.1.3.1 Course Section Marking Issues
Unfortunately, the benefits of a required set of course markings came with unforeseen issues:
- Departments within most institutions had different methods of reporting their course sections to the Registrar for Banner entry. Many of these methods included a spreadsheet, but each spreadsheet was unique to the department.
- Responsibilities for reporting course markings fell on different positions in each department or institution:
  - Administrative assistants for departments often assisted with section reporting.
  - Some reporting responsibilities were given to campus stores per their other reporting responsibilities under the Higher Education Opportunity Act (HEOA). Campus stores can only guarantee knowledge of a textbook adoption if the textbook is sold through the campus store or otherwise indicated within their systems by individual faculty members who took the initiative to report it.
  - Registrar's Offices were usually the final point of control for Banner data, but they received this data from departments or stores or both. Registrars would then have to clean the data before entering it in Banner. The best a Registrar's Office could do was provide an accurate depiction of what had been sent to their office; they could not ensure the accuracy of what they received.
Institutions were subsequently given the opportunity to allow faculty to suggest changes in Banner to their course sections, giving faculty the ability to mark their own courses as ZNCM or ZLCM after the approval of a Banner administrator. Because this solution was optional for institutions, and because of the burden of raising awareness among all faculty members not only about the course markings but also about how to use Banner, these efforts did little to improve data accuracy.
6.1.3.2 Defining the Major Problem and Current State
The overall issues with ZNCM and ZLCM course markings in the USG could be defined this way:
- Accurate data would mean a 100% response rate from the people who know which materials are being used in which course sections.
- Unless a department has made a department-wide selection of a textbook for a course, the only people who know which materials are being used in course sections are the faculty instructors assigned to those sections and the students who then pay, or do not pay, for the materials.
- Faculty instructors in course sections can change abruptly, and part-time/adjunct instructors can often be assigned to a course section very quickly before the start of a semester.
- These late additions and late changes cannot be accounted for in course schedule data, which is reported earlier in order for students to be able to register. Marking a section as no-cost, only to have a new instructor require a high-cost textbook, would be a surprise for any student registering for the no-cost-marked section.
- Even if faculty instructors would like to report their no-cost or low-cost course sections, and that reporting is essentially voluntary, the ways in which departments report their sections vary.
- With the previous points in mind, the only way to ensure accuracy would be:
  - An easy and uniform way for faculty to report ZNCM and ZLCM for each section, then:
  - A requirement for all faculty to accurately report ZNCM, ZLCM, or neither.
- This solution, due to ALG's position in the system office, an inclination towards avoiding mandates, and a lack of tools to make this process usable for all faculty involved, is not possible.
Therefore, the current state of accuracy management in Banner data collection is that institutions are left to find their own organizational solutions that improve, but do not perfect, their data collection processes. ALG has met with multiple institutional teams on improving their data collection; every institution's situation differs, and both the solutions and the people involved in those solutions will differ by institution.
6.1.3.3 Banner Markings in ALG Reports
Until ZNCM and ZLCM data achieve a higher level of accuracy, ALG cannot rely on this data to validate the continuation of grant projects and their savings. Instead, ALG continues the Sustainability Survey and continues to provide the conservative estimate, which includes zeroing out non-respondents. ZNCM and ZLCM data are used in a comparative manner in reporting: the most important indicators of how institutions are doing, in both making course materials affordable and reporting that affordability, are the percentages of course sections marked ZNCM or ZLCM compared to all course sections at the institution, excluding eCore sections in the catalog. This at least somewhat accounts for variations in the number of course sections and course size at institutions.
This does not, however, account for differences in the courses offered - for example, there are far more OER available for most introductory mathematics courses than for graduate-level ones. Some institutions do not offer graduate-level courses, or offer very few, so these courses make up a smaller proportion of total course sections than at a larger research or comprehensive institution. Institutions are therefore compared to both the systemwide average and the average of their USG institutional classification (research, comprehensive, state university, state college). Institutions are also ranked in two lists in the report: one for grant savings, one for ZNCM and ZLCM as a combined percentage of all sections.
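The comparison metric above amounts to a simple ratio. A minimal sketch, assuming the marked-section count already excludes eCore sections and that results are rounded to one decimal place (both assumptions, not stated ALG practice):

```python
def marked_percentage(marked: int, total: int, ecore: int) -> float:
    """Percent of an institution's course sections marked ZNCM or ZLCM,
    with eCore sections excluded from the denominator.
    `marked` is assumed to already exclude eCore sections."""
    return round(100 * marked / (total - ecore), 1)
```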
6.1.4 2018-2023: Excel Sheet Limitations
Banner data from every institution is provided in one spreadsheet per semester by the Research and Policy Analysis (RPA) office, with an "X" in the ZNCM or ZLCM column if the marking has been selected for the course section. When ALG began to receive Banner data from RPA, there was a quick realization that Banner data and ALG grant savings data were entirely separate from each other and tough to connect in order to answer complex questions. The Banner data for just one semester was so vast that an Excel formula connecting the data to a grants spreadsheet was infeasible: Excel buckled under the pressure of the demands we had for connecting this data together. Adding all the Banner rows for one semester to a master Banner data spreadsheet also went beyond the limits of Excel. Both of these limitations meant that Banner data analysis required first analyzing one semester's spreadsheet, then adding various totals to an ever-growing second spreadsheet.
The grants data was also too big for its Excel container, growing beyond its 2016 model with far too many columns: with the grant as the row, established in Round 1 during our first year, one column had to be appended for every semester's students affected and every semester's student savings. These needed to be connected, through formulas, to the Sustainability Survey in order to be zeroed out if the project was discontinued or had no response. Looking at the spreadsheet provided no answers to readers: a separate Excel tab was used just to carry the results of these formulas to a readable place, which required still more formulas. Even without taking RPA Banner data into account, ALG was breaking the computational limits of Excel.
When data gets too big for Excel, the traditional solution is to move this data into a formal database. We had, in essence, two databases: one with Banner data for each semester in their own spreadsheets, and one with each grant and its savings for each semester, but their separate nature and massive size stopped cross-spreadsheet formulas from being viable after a couple of years: we only had individual Excel files to work with when answering questions. Enlisting IT to convert all of these into a database on a hosted server, complete with semester-by-semester maintenance of that data, was impossible given ALG's position and capacity. Through a Certified Public Manager course with the Carl Vinson Institute of Government (CVIOG) at the University of Georgia (UGA), the ALG Program Director learned more about data management and further partnered with CVIOG to find a solution.
6.1.5 2024-Present: The ALG Data Model
The agreed-upon solution was a tool within the USG's subscribed Microsoft Office applications: Power BI. Using a data management program called PowerQuery, Power BI enabled Excel spreadsheets to be converted into what is essentially a database within an Office application. While the first attempts at making reports and dashboards from Power BI involved one spreadsheet at a time, CVIOG proposed a different solution: using a folder of spreadsheets, one per semester, as the database, with an auto-join function picking up new Excel sheets, cleaned to be uniform with the rest, as they were added to the folder. Through Power BI and PowerQuery, ALG was able to connect matching columns across different spreadsheets, including a column in one spreadsheet containing some of the data from a column in another, which gave the application a way to connect our grants information with our Banner information. The resulting data model is ALG's current, more flexible method of collecting, cleaning, analyzing, and reporting data to all stakeholders across the system.
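The folder-as-database pattern is not specific to Power BI. A standard-library Python sketch of the same idea follows; the use of CSV files (rather than Excel workbooks) is an assumption for illustration only.

```python
import csv
from pathlib import Path

def load_semester_folder(folder: str) -> list[dict]:
    """Append every uniformly-structured semester file in a folder into
    one combined table, analogous to how a PowerQuery folder source
    auto-joins new spreadsheets as they are added to the folder."""
    rows: list[dict] = []
    for path in sorted(Path(folder).glob("*.csv")):
        with path.open(newline="") as f:
            rows.extend(csv.DictReader(f))
    return rows
```

The key design property is the same as in the Power BI model: adding a new semester requires only dropping one more uniformly-shaped file into the folder, with no changes to the combining logic.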
Grants data was similarly made "tall" with multiple savings spreadsheets instead of its original "wide" one-spreadsheet, columns-per-semester organization. In a new folder, each semester has a new spreadsheet with all grants listed by their unique grant number in one row each. That semester's savings data, controlling for the Sustainability Survey, is entered into the spreadsheet. The information about the grant, from the team members to the subjects and courses covered, resides in one spreadsheet outside of that folder. That spreadsheet is also organized by the unique grant number; those two columns are connected in Power BI.
Time is organized in a separate spreadsheet. Semesters are hard to put into order: summer 2015 comes before fall 2015 chronologically, but "Fall 2015" would appear above "Summer 2015" in an alphabetical list. A separate spreadsheet provides a Semester Index with a simple number for each semester. The Grant Information spreadsheet contains the final semester (and the first semester of teaching) of a project, and the index number for the semester is entered into each grant savings spreadsheet. Putting all data in chronological order therefore means sorting by the semester index column, connected across all sheets in the model through either the index itself or the spelled-out semester included in the semester spreadsheet. The semester index spreadsheet also includes other information about the semester: the calendar year, the fiscal year, and an approximate date resembling the start of each semester. Data from across all of these sheets can then be arranged and filtered by calendar and fiscal years. In the event anything needs to be sorted by date, this data model is compatible.
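The Semester Index trick can be demonstrated in a few lines. The index values below are illustrative, not ALG's actual numbers.

```python
# A lookup table assigns each semester an integer so any sheet's rows can
# be put in true chronological order; plain alphabetical sorting would
# put "Fall 2015" ahead of "Summer 2015".
SEMESTER_INDEX = {
    "Summer 2015": 1, "Fall 2015": 2, "Spring 2016": 3,
    "Summer 2016": 4, "Fall 2016": 5, "Spring 2017": 6,
}

def chronological(semesters: list[str]) -> list[str]:
    """Sort spelled-out semester names by their index number."""
    return sorted(semesters, key=SEMESTER_INDEX.__getitem__)
```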
6.1.6 Connecting Data Management Context to Procedures
The data processes listed in this SOP are all related to the resulting data model, itself the result of years of improvements and collaborations on data collection, all of which started under very different funding circumstances and minimal knowledge of data practices in 2014, ten years before the new data model was established. Remnants of ALG's first steps will likely linger in these procedures; this section, or any section of the SOP for that matter, should not be seen as the best-case scenario for a modern data model. The SOP describes the procedural scenario that ALG is currently operating within.
Various Excel spreadsheets are used in these procedures for tracking ALG data, and the procedures listed depend on how these spreadsheets are used within Power BI and the resulting data model. Procedures or entire processes may change at ALG in the future: Power BI may change, the USG may discontinue a subscription, and there are promising open-source developments in reporting using the programming language R and the Tidyverse package through Quarto, a way to make a document based in R and Markdown into a dynamic, customizable report with easy export capabilities. ALG's exploration of the R language and how Markdown worked within Quarto led to using Markdown as the lead format for this SOP.
Data procedures may come across as complex to an SOP reader, and they do take a considerable amount of time to perform, but as stated in the Purpose section of this SOP, complex procedures at ALG are in service of simplicity for our stakeholders. While ALG staff are always seeking more efficient ways to effectively meet our objectives, ALG staff are not interested in decreasing the organization's helpfulness in the interest of this efficiency. Clear, timely, and accurate reporting in simple, welcoming formats for stakeholders requires data knowledge, technical knowledge, and diligence from ALG staff.
6.2 Recurrence of Data Processes
Data processes recur at various intervals throughout a fiscal year:
- Grants Data Entry Per Semester: Three per year; summer, fall, and spring semesters
- Banner Data Entry Per Semester: Three per year; summer, fall, and spring semesters
- Provost Reports: Two per year; fall and spring semesters
- Sustainability Check: Annual
6.3 Process: Entering Grants Data Per Semester
6.3.1 Check previous data.
6.3.1.1 Ensure the current Sustainability Check is up to date.
- The new check should be finalized at the end of the calendar year / beginning of the next calendar year.
- See Sustainability Check Methods for details.
6.3.2 Create a new data sheet.
6.3.2.1 Copy a previous savings data sheet (most recent) as the template.
- Start the new sheet’s filename with the new semester index number, then the season, then the year.
- For example: “029 Summer 2024.”
- This keeps everything in chronological order while still making the semester clear.
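The naming convention can be generated programmatically; a tiny sketch with a hypothetical function name:

```python
def savings_sheet_name(semester_index: int, season: str, year: int) -> str:
    """Build a savings data sheet filename such as '029 Summer 2024':
    the zero-padded semester index keeps files in chronological order
    while the season and year keep the semester human-readable."""
    return f"{semester_index:03d} {season} {year}"
```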
6.3.3 Enter data from existing spreadsheets.
6.3.3.1 Copy and paste the appropriate data from the GrantInfo spreadsheet.
Helpful Shortcut: Ctrl-Space selects an entire table column.
- In the Data folder, open the GrantInfo spreadsheet.
- Filter Type by Transformation only.
- Filter Savings Start to not include future semesters past your savings data sheet’s semester.
- For example, you’re making Summer 2024 in Spring 2025. Uncheck all semesters after Summer 2024.
- In the semester’s savings data sheet, copy and paste the newly filtered columns from GrantInfo:
- Grant #
- Type
- Savings Start
- In the same savings data sheet, insert three new columns and copy and paste the corresponding filtered GrantInfo columns into them:
- Students Summer
- Sustainability Check
- Savings Per Student
- Ensure that your Students Summer column rounds to the nearest whole number. No half-students or third-students.
- Once everything is copied, in GrantInfo, remove the filters and save the sheet.
Note: The last three columns need to be added as new columns in a copy of the previous sheet because they’re part of the calculations but not part of the final table needed for Power BI. We’ll be deleting them later.
Helpful Shortcut: Whenever you need to paste one cell into an entire table column, whether values or formulas, there's a quick way: press Ctrl-C to copy the cell, immediately press Ctrl-Space to select the entire table column, then press Ctrl-V to paste. An icon may also appear that allows you to fill the entire column upon creating the formula; either way is faster and more wrist-friendly than clicking and dragging through a column with the mouse.
6.3.3.2 Enter new values for the current semester.
- Change the Savings Semester column to the current semester. This is for Power BI purposes only.
- Change the Semester Index column to the current semester index number. This is for Power BI purposes only.
6.3.4 Enter all calculations.
- In the “Students” column, in the first non-header cell, enter the formula =IF([@[Sustainability Check]]="Continued",[@[Students Summer]],0)
- Be sure to change “Students Summer” to whichever semester you’re using.
- Copy this formula through the whole column.
- Use the Ctrl-C, Ctrl-Space, Ctrl-V shortcut from above.
- To check that the formula is working, ensure that all Continued-marked projects have a student count and not 0. (In rare cases, such as a multi-institution project, the students affected may legitimately be 0.)
- Reformat the formulas as values only.
- You can copy the column and then right-click and paste using the “123” button to paste values, but here’s a fast shortcut: Ctrl-Space, Ctrl-C, Ctrl-Shift-V to select column, copy column, and paste values only.
- In the “Savings” column, in the first non-header cell, enter the formula: =[@Students]*[@Savings Per Student]
- Copy this formula through the whole column.
- Ctrl-C, Ctrl-Space, Ctrl-V as the shortcut.
- Reformat the formulas as values only.
- Ctrl-Space, Ctrl-C, Ctrl-Shift-V as the shortcut.
6.3.5 Reformat the spreadsheet to prepare it for Power BI.
6.3.5.1 Normalize the columns with your previous sheets.
- Delete these columns:
- Students Summer
- Sustainability Check
- Savings Per Student
- The order of columns should be:
- Grant #
- Type
- Savings Start
- Students
- Savings
- Savings Semester
- Semester Index
- Save the file and close the spreadsheet.
Note: Any name changes to the header row of these columns will change the name of the column when Power BI tries to process it. This will create errors until that header row is back to normal.
6.3.6 Refresh all Power BI reports linked to Savings Data.
6.3.6.1 Refresh the data in each Power BI report file.
In the Data folder, open all Power BI dashboards which use savings data:
- Institution Dashboard
- Provost Dashboard
- Public Dashboard
For each report:
- Click “Transform Data” on the top ribbon.
- On the ribbon, under “Refresh Preview,” click the arrow and click “Refresh All.”
- Check for errors.
- On the ribbon, click “Close & Apply.”
- You should now be in the report section.
- Check to see if the current semester is in the title (like in Public Dashboard).
- Manually type in the latest semester in the title or subtitle if necessary.
- On the ribbon, click “Refresh.” This should take a while.
- Once the refresh completes, click “File,” “Publish,” “Publish to Power BI.”
- Power BI will ask if you want to replace the current file in the workspace - yes, it’s fine.
Once this is complete, move to the next report until all reports are updated.
6.3.6.2 Ensure Power BI reports are updated in the web app and website.
- Open the Power BI Web App and open one of the newly-updated files.
- For each, click “Edit,” and then click “Save” to ensure they’re up to date on the ALG website.
- Check to ensure the reports are updated on the ALG website.
- In the website admin, change the featured numbers according to the new totals in the Public and Institution Dashboards.
6.4 Process: Entering Banner Data Per Semester
6.4.1 Format Banner Data for Power BI Reports.
6.4.1.1 Receive Banner data.
- New Banner data arrives by email after the end of the previous semester:
- Summer: Late August
- Fall: Late January
- Spring: Early June
- Download the spreadsheet.
- Move the file to the Data > RPA Data folder.
6.4.1.2 Format the spreadsheet.
- Open the new Banner data spreadsheet.
- In the lower tab, rename the first “srpt” title to “Data.”
- Format this sheet as a table.
- Create the Semester Column.
- Insert a column before SETID as Column A. Name this “Semester.”
- Format this column as Text.
- Enter the current semester in the first cell, e.g. “Fall 2024.”
- Copy and paste that cell down the column: Ctrl-C, Ctrl-Space, Ctrl-V.
- Delete columns to keep things uniform.
- “TOTALCREDITHOURS”
- Every “Z” column to the right of ZLCM and ZNCM.
- Ensure the spreadsheet is saved and close the file.
In the future, we might want to go back to the original RPA Data sheets in their shared drive to retroactively add a blank column for each of these deleted columns. We may then be able to discern how likely a ZNCM or ZLCM course is to be aligned with another HIP or with credit hour totals.
6.4.1.3 Ensure the Semester Index is up to date.
- In the Data folder, open the SemesterInfo spreadsheet.
- Ensure that the current semester is included. Add semesters if needed.
- It’s okay to add a couple more years of semesters in here to make sure everything’s up to date.
- For the date column, use a standard date for each season.
- Spring = 1/1
- Summer = 6/1
- Fall = 9/1
- Save and close the spreadsheet.
6.4.2 Refresh all Power BI reports linked to Banner data.
6.4.2.1 Refresh the data in each Power BI report file.
In the Data folder, open all Power BI dashboards which use RPA Data:
- Institution Dashboard
- Provost Dashboard
- Public Dashboard
For each report:
- Click “Transform Data” on the top ribbon.
- On the ribbon, under “Refresh Preview,” click the arrow and click “Refresh All.”
- Check for errors.
- On the ribbon, click “Close & Apply.”
- You should now be in the report section.
- Manually type in the latest semester in the title or subtitle if necessary.
- On the ribbon, click “Refresh.” This should take a while.
- Once the refresh completes, click “File,” “Publish,” “Publish to Power BI.”
- Power BI will ask if you want to replace the current file in the workspace - yes, it’s fine.
Once this is complete, move to the next report until all reports are updated.
6.4.2.2 Ensure Power BI reports are updated in the web app and website.
- Open the Power BI Web App and open one of the newly-updated files.
- For each, click “Edit,” and then click “Save” to ensure they’re up to date on the ALG website.
- Check to ensure the reports are updated on the ALG website.
- In the website admin, change the featured numbers according to the new totals in the Public and Institution Dashboards.
If you entered the fall or spring semester data, proceed to create and send Provost Reports. Do not create and send Provost Reports for summer data.
6.5 Process: Creating and Sending Provost Reports
6.5.1 Create updated Provost Reports in Power BI.
6.5.1.1 Update the Power BI report file.
- Open the Provost Dashboard in the Data folder.
- Hit “Refresh” to ensure the dashboard is current.
- If you get an error that says the data needs to be “rebuilt,” go in the ribbon to Home > Transform Data and then Refresh > Refresh All. Close and apply. It should work from there.
- Sometimes this freezes. Going to Transform Data and hitting “Refresh All” should work.
- Check your data filters to ensure data is being displayed correctly.
- ABAC, being the first institution on the list, will display as the institution when "All" (systemwide data) is selected in filters. Check that the institution filter is working properly.
- Ensure that appropriate eCore filters are in place for all course marking reports.
- On the following tabs, change the “Filters on this page” “Semester” filter to the current semester and change the title at the top of the page accordingly:
- Courses without Markings
- Courses with Markings
- Rankings by Institution
- On the first tab, “Cover,” manually change the semester to the current semester and change the created date to the current date.
6.5.1.2 Print out the PDF reports.
- On the “Totals” tab, change the Institution slicer to the first one on the list.
- Change the Classification slicer to the only one that appears on the list.
- This is a necessary extra step; there is no way for the slicer to update itself.
- Click “File,” “Export,” “Export to PDF.”
- Save the document: ALGReport-[institution abbreviation]-[semester]
- Example: “ALGReport-ABAC-Spring2024”
6.5.2 Prepare a new Mail Merge spreadsheet.
6.5.2.1 Create a new Mail Merge spreadsheet.
- In the Data > Provost Reports > Mail Merge folder, copy the previous semester’s spreadsheet as a template.
- Rename the copy's filename to the current semester.
6.5.2.2 Export and copy the institutional percentages from Power BI into the spreadsheet.
- In “Provost Dashboard,” in the “Rankings by Institution” tab, click on the “No-Cost and Low-Cost Section Percentages (Non-eCore)” table.
- In the top-right corner, click the three dots.
- Click “Export Data.”
- Save the CSV file in the Mail Merge folder. Rename it NCLC_[Semester].
- Keep this Provost Dashboard open for further steps.
- In the Mail Merge folder, open the CSV file, click “Insert,” then “Table,” and make a table out of the data.
- Sort the list by institution A-Z.
- Open the new semester’s Mail Merge spreadsheet.
- Copy all three percentage columns from the exported CSV and paste them into the three “NC, LC, NCLC” columns in the spreadsheet.
- Close the CSV file. Keep the Mail Merge spreadsheet open.
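For programs that prefer to script this step, the export-then-sort sequence above can be sketched with the Python standard library. The “Institution” header is an assumption about the Power BI export; check your actual CSV headers before relying on this.

```python
import csv
import io

def rows_by_institution(csv_text: str) -> list[dict]:
    """Parse a Power BI CSV export and sort its rows by institution, A-Z,
    mirroring the manual 'Sort the list by institution' step.
    'Institution' is an assumed column header."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    return sorted(rows, key=lambda r: r["Institution"])
```

From the sorted rows, the three percentage columns can then be written into the Mail Merge spreadsheet in the correct order.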
6.5.2.3 Export the grant savings data from Power BI into the spreadsheet.
- Return to the “Provost Dashboard” Power BI report.
- In the “Rankings by Institution AT” tab, click on the “Transformation Grants” table.
- In the top-right corner, click the three dots.
- Click “Export Data.”
- Save the CSV file in the Mail Merge folder. Rename it Savings_[Semester].
- In the Mail Merge folder, open the CSV file, click “Insert,” then “Table,” and make a table out of the data.
- Sort the list by institution A-Z.
- In the column to the right of the first savings value, enter =DOLLAR(C2). This converts the numeric dollar value into a text value.
- Copy this formula for the rest of the table: Ctrl-C, Ctrl-Space, Ctrl-V.
- Copy this entire new column.
- Using Ctrl-Shift-V, paste only the values from the column to the Mail Merge spreadsheet’s Savings column.
- If you get a “#VALUE!” error, you have pasted the formula rather than the values. Copy again and paste with Ctrl-Shift-V.
- Close the CSV file.
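If the savings export is ever processed in a script instead of Excel, the =DOLLAR() conversion above can be reproduced in Python. This is a minimal sketch for non-negative values with DOLLAR’s default two decimal places; it does not reproduce Excel’s parenthesized formatting for negatives.

```python
def dollar_text(value: float) -> str:
    """Format a non-negative number as a text dollar value,
    like Excel's =DOLLAR(C2) with the default two decimal places."""
    return f"${value:,.2f}"

print(dollar_text(1234567.5))  # $1,234,567.50
```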
6.5.2.4 Copy and calculate the classification and systemwide percentages.
- Sort the new semester’s spreadsheet by the Classification column, A-Z.
- In “Provost Dashboard” > “Totals,” change the “Classification” slicer to “Comprehensive Universities.”
- The percentages for that semester will show in the Markings Comparison tab.
- In the Mail Merge spreadsheet, enter the No-Cost % value in the ClassNC column and the Low-Cost % value in the ClassLC column.
- The ClassNCLC column should update automatically.
- Repeat this for the other three classifications.
- Enter the systemwide percentages from the bottom of this report page into the spreadsheet under SystemNC and SystemLC.
- You now have all of the numbers you will need for this spreadsheet. Close the Provost Dashboard.
6.5.2.5 In the spreadsheet, make notes on each report.
NOTE: Be careful of note length. If notes are over 255 characters, Mail Merge will cut the note off. Be sure to check this in the Ribbon's Mailings tab using "Preview" before sending the Mail Merge emails.
- Re-sort the spreadsheet by institutions, A-Z.
- Open each printed report for each institution.
- You could also use the Provost Dashboard to view this, but we have found the PDF checking method clearer; page 2 helps the most.
- Institutional abbreviations in the filenames and full institution names in the spreadsheet won’t line up alphabetically, so be sure you’re opening the right report as you go down the institution column.
- Mention what you are noticing about the report, especially any year-over-year trends. Thank institutions if things are going well or if efforts are being made to improve.
- If you know about an effort that’s happening regarding NC/LC at the institution, mention that here.
- See previous spreadsheets for examples.
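Because Mail Merge silently truncates notes over 255 characters (see the NOTE above), a quick scripted check can catch overly long notes before you preview. A minimal Python sketch, assuming the notes are available as an institution-to-note mapping:

```python
MAX_NOTE_LENGTH = 255  # Mail Merge cuts off anything longer than this

def notes_over_limit(notes: dict[str, str]) -> list[str]:
    """Return the institutions whose note would be truncated by Mail Merge."""
    return [inst for inst, note in notes.items() if len(note) > MAX_NOTE_LENGTH]
```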
6.5.2.6 Update all contacts in the spreadsheet.
- Update the VPAA names and email addresses.
- Update the ALG Champions.
- Ensure the ALG Champions spreadsheet is up to date with what’s listed on the website and fix as necessary.
- Copy and paste the ALG Champions into this semester’s spreadsheet to ensure accuracy there as well.
- In ChampionsList, the Name Concatenate formula converts all Champion names into a list. Watch out for the syntax in the AdditionalChampions column, which is formatted to end the list when there are additional Champions.
- Put this formula in the top cell: =CONCAT([@[Faculty Champion]],", ",[@[Library Champion]],", ",[@[Design Champion]], [@AdditionalChampions])
- Copy the cell down the column and replace the column with values only (Ctrl-Shift-V or right-click paste 123 icon).
- Manually format this list as appropriate for missing Champions, empty commas, or additional information.
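The same join-and-clean-up logic used for the Champion name list (and the email list below) can be expressed in Python, which skips blanks automatically instead of leaving empty commas to fix by hand. The function name is illustrative:

```python
def concat_list(*items: str, sep: str = ", ") -> str:
    """Join Champion names (or emails, with sep='; ') into one list,
    skipping blank entries so no empty separators need manual cleanup."""
    return sep.join(i.strip() for i in items if i and i.strip())

print(concat_list("A. Smith", "", "B. Jones"))  # A. Smith, B. Jones
```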
- In ChampionsEmailList, the Email Concatenate formula converts all Champion emails into a list for entry in Outlook.
- Put this formula in the top cell: =CONCAT([@FCEmail],"; ",[@LCEmail],"; ",[@DCEmail], [@AdditionalChampionsEmail])
- Copy the cell down the column and replace the column with values only (Ctrl-Shift-V or right-click paste 123 icon).
6.5.2.7 Upload the PDF reports to the ALG website.
- Go to the website admin (Expression Engine) at https://affordablelearninggeorgia.org/admin.php.
- Go to Entries > Our Impact > [semester].
- The “Our Impact – Institution Reports” field is a file grid. Remove the old file with the “X” button, and then drag-and-drop the new PDF.
6.5.2.8 In the spreadsheet, link the uploaded PDF reports.
- In Expression Engine, “Files” will list all the files you’ve uploaded. Copy the link by right-clicking the “link” button on the right side of the list.
- You could also select the URLs in the spreadsheet, press Ctrl-F, and replace the past semester (like Fall2024) with the current semester (like Spring2025). Just be careful that all the filenames are otherwise unchanged; in a year with a merger, this will be important to watch.
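The find-and-replace shortcut above can also be scripted. A minimal Python sketch; like the Ctrl-F method, it is only safe when the rest of the filename is unchanged between semesters:

```python
def update_semester_url(url: str, old_semester: str, new_semester: str) -> str:
    """Swap the semester token in a report URL, like the Ctrl-F replace step."""
    return url.replace(old_semester, new_semester)

print(update_semester_url(
    "https://affordablelearninggeorgia.org/files/ALGReport-ABAC-Fall2024.pdf",
    "Fall2024",
    "Spring2025",
))
```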
6.5.3 Send the Provost Reports.
6.5.3.1 Advise the Champions that you are sending out the Provost Reports.
- Send an email to the ALG-CHAMPIONS-L listserv.
- Use the Mail Merge spreadsheet to give everyone a quick overview of where the reports will be sent: copy and paste the first four columns with the institution, URL, VPAAName, and VPAAemail.
6.5.3.2 Send the email.
Note: In this particular case, the “Mail Merge” Word document is not meant for sending a mail merge directly, because Word’s Mail Merge sends one standalone email per recipient and cannot CC the ALG Champions. Finding a better way to do this would be great, but for now, we use the Preview Results button under Mailings in Word, move to the institution, copy and paste the message, and email it manually.
- Open the “Provost Emails” document file.
- It will ask you if you want to continue. Click “No.”
- You will then access the document without any of the mail merge linked.
- Click “Mailings,” then “Select Recipients,” then “Use Existing List.”
- Select this semester’s spreadsheet.
- The fields should populate the same way they did in the last semester. Click “Preview Results” to check.
- Be sure to change the mentioned semester in the first paragraph to the new semester. This will not change automatically.
- Click “Preview Results” to check how each Mail Merge looks.
- Ensure that for all emails, the Notes field does not have a cutoff due to length. You’ll have to fix this in the spreadsheet if it does.
- Using Outlook and the preview of each Mail Merge email, copy and paste the email body, then send it to the appropriate Provost in the “To” field with all Champions in the “CC” field.