Chalk, talk, and energy efficiency: Saving electricity at South African schools through staff training and smart meter data visualisation

South African schools spend money on electricity that could otherwise go towards their primary purpose of educating children, yet they tend not to be proactive about these costs. Managing the monthly bills is a daunting task: the usage numbers on the bill are incomprehensible, the time-of-use tariff structure is bewilderingly complex, and the person who deals with payments is usually an overburdened teacher. This paper describes a controlled behavioural experiment carried out at five schools in Stellenbosch. We provided highly visualised and intuitive daily, weekly and monthly reports to three of the schools, with the other two acting as controls. The reports were shared with teachers, principals and the governing body to encourage accountability. Usage was shown as a colour-coded energy and financial component per hour and per tariff class. Using the difference-in-differences method, we found financial savings of 11% and 14% at two of the schools and 13% at the third school, where staff were also trained on the tariff structure before reporting started. These results suggest that sharing high-frequency, digestible information could help South African schools reduce their energy bills; a saving that has particular relevance given the country's dependence on fossil fuels.


Introduction
Against a background of global concern about how to save energy, one user that is often overlooked is the school sector. South Africa's 25,348 schools are heavily reliant on electrical energy for lighting, heating, and educational activities. Besides the obvious benefits to the environment, the already resource-constrained education system stands to benefit financially from any savings on electricity bills that could be reinvested in critical educational needs. South African schools are in a dire state (Spaull, 2013). New attitudes can be achieved through technological and behavioural interventions, which could have spin-offs in encouraging involvement in initiatives to save the environment (James and Ambrose, 2017; Kok et al., 2011; Sloot et al., 2018; Staddon et al., 2016; Sovacool, 2014). At the time of writing, demand for electricity was outstripping supply in South Africa, with load-shedding an occasional occurrence. The ailing state-owned power utility Eskom is struggling to keep up with necessary maintenance, and its financial woes mean respite is not expected soon (Thesouthafrican.com, 2018). The power interruptions are severely affecting businesses and damaging the economy, and schools are also severely affected, though they do not make the headlines. With their reliance on electrical equipment for basic educational needs, such as data projectors, computer laboratories, in-class monitors and administration services, schools are sensitive to frequent interruptions in supply (eNews Africa, 2018; EWN, 2019).
With the legacy of apartheid and the Bantu Education Act of 1953 hanging over the education system, South African schools need every cent to be well spent towards achieving good education for all (Unicef, 2018). But research shows they are failing in basic management of their finances (Myende, Samuel and Pillay, 2018). Schools tend to overspend, especially on maintenance, and per-pupil education expenditure has declined by 8% in real terms since 2010 (Gina, 2018; Spaull, 2019).
Our study focused on changing behaviour at schools to achieve financial savings by sharing straightforward energy-usage information from smart meters. The following is a brief review of literature relevant to the study.
Energy-saving interventions are of two kinds: technological, involving technology or appliances that increase efficiency, and behavioural, where the aim is to change human behaviour or usage patterns (Allcott, 2011; Sovacool, 2014). In a large-scale questionnaire study, Sloot et al. (2018) found that influencing behaviour was an effective way to get people involved in community energy-saving initiatives.
However, Frederiks et al. (2016) found that behavioural interventions often produce unreliable results because they are not well designed. They provide handy guidelines to help improve such interventions. To ensure reliable and unbiased evaluation of an intervention, they recommend using the following criteria: a sufficiently large, clearly defined, and representative target population and control group, standardised delivery of treatment, running a single treatment at a time, valid and reliable measures of behavioural outcomes, and appropriate duration and continuity of the programme. A good programme is one that achieves the maximum score on these criteria. They also recommend the use of large-scale randomised controlled trials (RCTs). Such trials are of course costly and complex to administer, so they may not be possible for school programmes. These recommendations are echoed by Sovacool et al. (2018), who bemoan poor experimental design and subsequent write-up of energy experiments.
In a pilot project in Switzerland, Wemyss et al. (2019) conducted a behavioural intervention based on a mobile phone 'app', called the 'Social Power App', that combined energy information sharing with social media gaming. They formed virtual communities that engaged cooperatively in energy conservation through energy feedback intended to trigger behaviour change. They found that investing in a cooperation, collaboration and feedback technology like this could be useful to support behavioural change. They reported up to 8.5% electricity savings in the short term (three months after the intervention), and the Social Power participants self-reported more saving behaviour even after a year. Allcott (2011) and Klege et al. (2018a) both demonstrated that electricity usage can be efficiently reduced through behavioural interventions. Allcott (2011) found that sending reports on energy usage and recommendations on energy conservation to households in the United States, comparing their usage to that of their neighbours, reduced usage by 11 to 20%, which was the equivalent expected impact of a short-run price increase. Klege et al. (2018a) sent different behavioural nudges containing comparative energy usage information to 24 floors of a government building in South Africa. They reported savings ranging from 9% to 14% depending on what information each floor received. Bager and Mundaca (2017) were able to achieve a differential effect on energy usage of 16 to 25% through framing of usage information in a behavioural study, using smart meter data and an online dashboard.
In their analysis of a South African university's management of its electricity bills, Maistry and McKay (2016) found that sub-metering was inaccurate or non-existent and operational budgets were limited. The institution failed to analyse its bills, did not draw on available academic knowledge to make sense of them, and provided no training. All this contributed to managerial failure and behaviour that did not promote energy conservation. They concluded that the key to managing energy bills in a large institution was to raise awareness and understanding and, importantly, to share usage information.
In one of the few studies we could find of energy usage in schools, Dias Pereira et al. (2014) used multiple sources to produce a benchmarking system for typical schools in specific geographical regions (assuming similar climate conditions). Such a system could help schools manage their energy usage and reduce costs. A similar helpful system is suggested by Larrumbide Gómez-Rubiera et al. (2019) through energy audits. We could find no studies that evaluated the impact of energy saving at schools, despite the fact that they often struggle to manage their electricity bills.
To help alleviate the financial burden of electricity at schools and to reduce their environmental footprint, we performed a small controlled experiment involving five schools in Stellenbosch, South Africa. We evaluated the results, quantified the savings in energy and in cost, and assessed the temporal changes in usage. The behavioural intervention consisted of an information-sharing component at three of the five schools and a training component at one of these three, with the remaining two schools constituting the control group.
We agree with Castri et al. (2014) that, as schools are a natural extension of a community, an intervention in a school may have ripple effects in the community. Following the lead of Frederiks et al. (2016), we used a standardised delivery method with valid measures and a clearly defined target group. Our difference-in-differences approach was based on Allcott (2011). We analysed primary and secondary schools separately, as the secondary schools' boarding hostels give them markedly different usage profiles.

The experimental design of the Stellenbosch schools study
The Stellenbosch region, in South Africa's Western Cape province, has 39 schools. For our experiment, we chose schools in close proximity with existing smart meters: three primary schools, which we labelled Schools A, B and C, and two secondary schools with boarding hostels, labelled Schools D and E. As all participating schools had existing municipal smart meters, no additional smart meters were purchased for this study. The data were extracted, processed and emailed with a Python script, providing ease of scalability. Table 1 shows the numbers of pupils at the schools and the dates of the interventions. The study took place from Tuesday 9 to Tuesday 30 October 2018, and the measurements were compared to the previous year's energy usage from Tuesday 10 to Tuesday 31 October 2017. Schools A, B and D were the treatment group, receiving reports about their energy usage, and Schools C and E were the control group that did not receive this information.
In addition to the reports, we also made ten-minute presentations to the teaching staff and the cleaning staff at School A on the causes and effects of greenhouse gasses and climate change and the financial, energy and emissions implications of typical devices used at schools, and, importantly, we explained the municipal electricity tariff structure. Preparation for the training was less than two hours for the first school, and less than 30 minutes for the second school. To scale this intervention, the training could be provided as an online video, but the data in the presentation needs to be customised to reflect the schools' actual usage from the smart meter data. The presentation can be found in Appendix A.
Savings can be achieved through using less energy to perform the same tasks (improved efficiency), or by changing the time of use into an off-peak period (load shifting) (Klege et al., 2018b; Larrumbide Gómez-Rubiera et al., 2019). Both of these methods of saving can be achieved through either behavioural change or technological intervention. Behavioural change leads to improved efficiency when the user interacts differently with the energy-consuming device, by, for example, switching lights or heaters off when not absolutely needed to perform the task, or not filling the kettle to the brim. Behavioural change could also result in using delay-tolerant loads at different times, for example boiling an urn 15 minutes later so as not to coincide with peak tariff periods. Technological interventions result in improved efficiency in numerous ways, for example replacing lights with energy-efficient lighting, improving pipe or roof insulation, or applying optimal water heater temperature control. Technological interventions are also well suited to load shifting (e.g. water heater and pool pump scheduling, cascading air conditioners). Although the distinction may appear clear, there are various overlaps. For example, when a principal notices that the water heaters turn on during peak hours, he or she could choose to purchase timers to schedule them for off-peak hours. This mechanism includes an action (a subtle change in behaviour) from the principal, but also a technological intervention. Another example is the deployment of smart meters to detect after-hours leaks or pipe failures, where the smart meter data could have a direct impact on how the user behaves. It has been shown, as mentioned in the previous sections, that pure behavioural interventions require effort to be sustained. Technological interventions tend to have a more lasting effect, provided they are maintained and do not fail technically.
In this experiment, we treat the schools as black boxes, and do not distinguish between behavioural and technological interventions. In fact, we treat all changes as "behavioural", since someone at the school had to act to deploy even technological interventions. It would stand to reason that most of the effects observed will not be lasting if not maintained, with behavioural results tapering off first.

Reports
We collected data from the municipal smart meters that are used for monthly billing. These third-party meters record and remotely report energy and peak load at half-hourly intervals. The readings can be downloaded as comma-separated value (CSV) files from the online platform Livewire (2006), for offline processing. The full dataset can be found in Appendix A, Supplementary material.
Our experience at the schools and discussions with the staff responsible for energy management made it clear that the schools did not understand the data in the municipal bills, an example of which is shown in Figure 1. We therefore processed the data and presented visualised reports that would be easy to understand and compare with reports from other weeks. The principals and staff were unfamiliar with the schools' tariff structure, which is based on time of use, whereas household users in South Africa pay a flat fee for energy used. Our reports therefore provided not only the energy used per hour but also its cost. We also took into account a non-technical reason for the poor management of school electricity bills. We realised there is little incentive for the responsible staff member to engage with the bill, get to grips with its complicated data and act on the information it contains, since this staff member does not pay the bill out of his or her own pocket. There is none of the thrift and caution that would be exercised at home. We therefore copied the principal and the responsible person from the school governing board into the emailed reports, to create a feedback loop and build a sense of shared responsibility.
The information was shared in the form of emailed daily, weekly and monthly energy reports. Although the period of the study did not extend beyond the end of a month, we sent a monthly report on the first day of the experiment in lieu of a historic weekly or daily report. Figures 2, 3, and 4 show examples and extracts from the reports. The full reports can be found in Appendix A, Supplementary material. Figure 2 shows the daily report as sent out to the schools. This report contained the hourly analysis, the peak demand for the day, the total energy used for the day and the related financial cost, and the total energy used in each of the municipality's tariff periods: off-peak, standard and peak (Stellenbosch Municipality, 2018). Figure 3 shows the graph from the weekly report. This report contained the average cost per weekday, the average cost per Saturday, a breakdown of the costs, and a summary of the energy and peak demand. This report made it easy for the school to compare the energy used and the cost of that energy. Our breakdowns of energy usage and cost for the week made it clear that the cost was not linearly proportional to the amount of energy used. This is why we emphasised the tariff structure. Figure 4 shows the pie chart from the monthly report. This report contained the energy and cost information for the past month, including a monthly total of all the measured quantities, the cost of energy per day, the average hourly cost of the energy used per day, the pie chart visualising the cost breakdown, and the expected amount of the monthly bill. The pie chart captured the overall costs for the month divided into the contributing segments, to help the school identify its energy usage patterns.
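The daily-report quantities can be derived directly from the 48 half-hourly meter readings. The following Python sketch illustrates the aggregation; the tariff-period hours and the function name are illustrative assumptions, not the exact Stellenbosch schedule, and the real script also rendered the graphs and emailed the result.

```python
# Sketch of the daily-report aggregation from half-hourly meter readings.
# The tariff-period hours below are illustrative, not the exact
# Stellenbosch time-of-use schedule.
PEAK_HOURS = set(range(7, 10)) | set(range(18, 20))                    # assumed
STANDARD_HOURS = {6} | set(range(10, 18)) | set(range(20, 22))         # assumed

def daily_report(half_hourly_kwh, half_hourly_kva):
    """half_hourly_kwh/kva: 48 readings for one day; index 0 = 00:00-00:30."""
    by_period = {"peak": 0.0, "standard": 0.0, "off-peak": 0.0}
    hourly = [0.0] * 24
    for slot, kwh in enumerate(half_hourly_kwh):
        hour = slot // 2
        hourly[hour] += kwh                    # hourly analysis for the bar chart
        if hour in PEAK_HOURS:
            by_period["peak"] += kwh
        elif hour in STANDARD_HOURS:
            by_period["standard"] += kwh
        else:
            by_period["off-peak"] += kwh
    return {
        "hourly_kwh": hourly,                  # energy used per hour
        "total_kwh": sum(hourly),              # total energy for the day
        "peak_demand_kva": max(half_hourly_kva),  # peak demand for the day
        "energy_by_period": by_period,         # per-tariff-period totals
    }

# Example: a flat 1 kWh per half hour, with one 12 kVA demand spike.
report = daily_report([1.0] * 48, [5.0] * 47 + [12.0])
print(report["total_kwh"])          # 48.0
print(report["peak_demand_kva"])    # 12.0
```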

Metrics
We measured and evaluated the impact of the intervention by comparing the treatment group to the control group while taking into consideration the previous year's usage, using the difference-in-differences (DiD) method, a method that is intended to mitigate the effects of extraneous factors and selection bias. We obtained the previous year's energy usage for the same dates, shifting by a day so the two periods would start on the same day of the week.
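The weekday alignment follows from stepping back exactly 52 weeks (364 days), which always lands on the same day of the week in the previous year. A minimal Python sketch, using the study's actual dates (the helper name is ours):

```python
from datetime import date, timedelta

def same_weekday_previous_year(d: date) -> date:
    """Step back exactly 52 weeks (364 days) so the comparison period
    starts on the same day of the week as the study period."""
    return d - timedelta(weeks=52)

study_start = date(2018, 10, 9)        # Tuesday 9 October 2018
baseline_start = same_weekday_previous_year(study_start)

print(baseline_start)                  # 2017-10-10, also a Tuesday
print(baseline_start.weekday() == study_start.weekday())  # True
```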
The main metrics we used were: the total energy used in kWh, both as an absolute and as a percentage of the previous year's usage; the amount of energy used, in kWh, during the hours in which peak rates are charged; the peak demand for the time under consideration, in kVA; and the effective financial savings due to the energy saved. These metrics provide an unbiased account of behaviour change (Frederiks et al., 2016; Sovacool et al., 2018), and place the results in the context of financial costs, the reduction of which is the main objective of the intervention.
We collected this information from all five schools in our sample. We then used the DiD method to calculate the energy usage difference after our intervention and compared the usage figures for the treatment group with those for the control group. This method eliminates differences in external factors (such as electricity prices and weather differences that would cause different electricity usage) and makes them the same for all the schools involved. This means we can expect the difference measured to be the result of our intervention (Allcott, 2011; Frederiks et al., 2016; Sovacool et al., 2018).
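As a sketch, the DiD estimate for a single treatment school reduces to the difference between its relative year-on-year change and that of its control school. The figures in the example below are hypothetical, not the study's data:

```python
def did_percentage(treat_before, treat_after, control_before, control_after):
    """Difference-in-differences as a percentage: the treatment school's
    relative change minus the control school's relative change.
    Negative values indicate a reduction attributable to the intervention."""
    treat_change = (treat_after - treat_before) / treat_before
    control_change = (control_after - control_before) / control_before
    return 100.0 * (treat_change - control_change)

# Hypothetical kWh totals: the treatment school grew 2% year on year
# while the control grew 15%, giving a DiD reduction of 13%.
print(round(did_percentage(10_000, 10_200, 8_000, 9_200), 1))   # -13.0
```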

Total energy usage change
The absolute energy used increased from the previous year for all the schools except School D, as shown in Table 2. In the year before the intervention, the three treatment schools and the two control schools had all followed the natural pattern of schools increasing their usage annually. However, the two primary schools in the treatment group, A and B, managed to keep their usage similar to the previous year's during the intervention period, while the secondary school in the treatment group, D, managed to significantly reduce its usage.
However, when we take the relative differences into consideration by applying the DiD method, we find a reduction in usage for all three schools in the treatment group, A, B and D, as shown in Table 2. The DiD analysis shows that School A reduced its energy usage by 11%, School B by 12% and School D by 25%.
To further validate the observed change, we evaluated the three treatment schools for the term that preceded the intervention, which ran from July to September, for 2017 and 2018. For this period we observe an absolute increase in energy use (kWh) of 62%, 64% and 43% for treatment schools A, B and D, corresponding to DiD changes of +13%, +3% and -10% respectively when compared to the control schools. This result shows that Schools A and B were in fact using more than control School C before the intervention, giving even more credence to the impact of the intervention. On the other hand, it shows that almost half of School D's improvement occurred before the intervention, which also explains its outlying profile seen in Figure 5, described in a later section.

Table 3 shows the absolute change in energy used during the different tariff periods for each school in 2017 and 2018. The treatment schools A, B and D managed to reduce their usage during peak hours even in absolute terms, while their usage in the standard and off-peak hours increased slightly. School D reduced its usage in absolute terms in all three tariff periods. School A reduced its usage during peak hours by 7%, School B by 1%, and School D by 17%. Table 3 also shows the relative reductions calculated with the DiD method, with respective peak-hour reductions of 18%, 11% and 23%. It is interesting to note that School A is the only treatment school that received the staff training intervention, which could explain why its peak-hour reduction is larger than its reduction in the other periods. The usage for schools in the control group increased for all the tariff periods. The results suggest that understanding the tariff structure and knowing the actual usage per hour showed the staff the importance of shifting the usage load away from the peak hours.
However, the results do not show whether these loads were in classrooms and being shifted by teachers, or outside the classrooms (e.g. swimming pool pumps and boreholes) and being shifted by the responsible personnel.

Peak demand change
The peak demand is responsible for a substantial part of the bill. But it is not easy for a school to manage, given the large number of people responsible for switching on heavy energy users such as kettles, urns, and air conditioners at undetermined times. Table 4 summarises the peak demand we observed for the five schools. As with the energy usage figures in Table 3, the peak demand at primary Schools A and B was slightly higher than the previous year's, both increasing by 8%, and at School C it increased by 22%. For the secondary schools, School D's peak demand actually increased by 61%, while the control School E's peak demand increased by only 39%. However, the results show that Schools A and B reduced their peak demand by 13% and 10% respectively compared to School C as control school in the DiD analysis. The results also show that School D increased its peak demand by 38% compared to School E as control school in the DiD analysis. This unexpected increase suggests that improved time-of-use behaviour without sufficient knowledge of the tariff structure and electricity principles may lead to unintended increases in peak demand. Conversely, an informed response to the information managed to reduce the peak load, albeit only slightly, by changing the schedules used for boreholes, water heaters and air conditioners, for example. This was the case for School A, which had sufficient knowledge of the tariff structure through the staff training intervention.

Note: Using the ZAR to USD exchange rate for 9 October 2018, the total change in cost, ∆ DiD Total, was USD 99, USD 92 and USD 255 for treatment schools A, B and D, respectively.

Total cost savings
We translated the observed changes in usage into cost savings. To do this, we looked at the prevailing rates and what the financial savings would be for the change in electricity usage by each school, in terms of time-of-use energy usage and peak demand. Table 5 shows the changes in cost associated with time-of-use and peak demand using the DiD approach per tariff period.
The tariff structure rates we used were as follows: off-peak = 66.8 c/kWh, standard = 89.0 c/kWh and peak = 135.1 c/kWh, with a peak demand charge of 40.1 R/kVA (Stellenbosch Municipality, 2018).
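Under these rates, the effective saving for a school combines the per-period energy reductions with the change in the peak-demand charge. The sketch below illustrates the calculation; the usage figures in the example are hypothetical, not a school from the study:

```python
# Stellenbosch time-of-use rates quoted above, converted to rand per kWh.
RATE_RAND_PER_KWH = {"off-peak": 0.668, "standard": 0.890, "peak": 1.351}
DEMAND_CHARGE_RAND_PER_KVA = 40.1

def cost_saving(kwh_saved_by_period, kva_reduction):
    """Rand saved: energy reductions priced per tariff period, plus the
    reduction in the peak-demand charge. Negative kWh values represent
    increased usage in that period (e.g. load shifted into off-peak)."""
    energy = sum(RATE_RAND_PER_KWH[p] * kwh
                 for p, kwh in kwh_saved_by_period.items())
    return energy + DEMAND_CHARGE_RAND_PER_KVA * kva_reduction

# Hypothetical example: 500 kWh shifted out of peak hours into off-peak,
# plus a 10 kVA reduction in peak demand.
saving = cost_saving({"peak": 500, "off-peak": -500, "standard": 0}, 10)
print(round(saving, 2))   # 742.5 (rand)
```

Note how pure load shifting saves money even though total energy is unchanged, which is why the reports emphasised the tariff structure rather than kWh alone.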
The combined effective savings for the three weeks due to the reduction in energy and peak demand amounted to R1,460 (13%) for School A, R1,362 (11%) for School B, and R3,800 (14%) for School D (USD 99, USD 92 and USD 255 at the prevailing exchange rate). Overall, energy usage was reduced, leading to effective cost savings. School D's peak demand increased, but this did not lead to an overall financial loss. The final effective savings were positive for the three schools, A, B and D, that participated in the intervention.
Again, we repeated the analysis for the term preceding the intervention, to validate that the effects are in fact due to the intervention. The results show DiD changes in cost of -4%, +8% and +16% for the treatment schools A, B and D when compared with the control schools. This further confirms that the changes in cost are in fact due to a change in behaviour, especially the moving of loads out of periods of peak rates.

Figure 5 shows the energy usage by Schools A, B and D, aggregated and averaged per hour for weekdays. It shows the period before the intervention (2017) and the periods during and after the intervention, the values of which were calculated using DiD for each hour. The results highlight the impact of the intervention in reducing energy usage. It is especially noticeable how School A, which received the staff training, and to a lesser extent School B, managed to move usage into off-peak times and out of the shaded peak hours. School A shows the best performance, as its load decreased during the peak periods and increased during the off-peak periods, i.e. between 04:00 and 07:00, which is clear evidence of load shifting. Furthermore, all three schools increased their loads after the evening peak period (after 20:00), which is more evidence of such shifting. School D managed to reduce its usage for the whole day, but especially from 10:00 to 13:00. A clear change is visible in School A's profile at 14:00, demonstrating staff participation in turning off equipment after school ended for the day. It is interesting to notice the effect of the hostel on the timing of School D's profile: the morning increase is an hour earlier, and the evening increase is more pronounced than at the two primary schools, A and B.

Conclusion
We described an experiment in which schools in Stellenbosch, South Africa, were given visualised information about their energy usage and its cost at hourly resolution. The experiment involved five schools, three primary and two secondary with hostels. Three of the schools, two primary and one secondary, were given daily, weekly, and monthly reports, and the other two schools acted as the control. We used the smart meters employed by the municipality for monthly billing to measure energy usage in 2017 and compared it with usage in the same period in 2018, using the difference-in-differences method. The reports were shared with several staff members, not only the one responsible for managing the electricity bills. In addition, the staff at one primary school were trained on the time-of-use tariff structure, since it is a foreign concept to household users in South Africa. The training will likely have to be repeated once or twice per year because of teaching staff turnover and desensitisation. Such training could also be done remotely as part of staff initiation using an online video, although the efficacy of such a method still needs to be established. The results showed a substantial usage reduction by the schools that received the reports, whereas there was no usage reduction at the schools that did not receive them. The primary schools reduced their costs by 11% and 13% (the latter being the school that received the tariff structure training) and the secondary school reduced its costs by 14%. Although our sample is small and the schools relatively affluent, which could bias the results, they seem to indicate that sharing high-frequency, easily understandable visualised information with the requisite parties at schools could help reduce their energy bills. A remaining challenge is the development of a framework to analyse and describe the behavioural change, along with an analysis of the specific actions taken by the schools.
Moreover, we recommend that electricity savings should be covered explicitly in the school curriculum and combined with similar smart metering-based behavioural interventions.