By Santiago Peñate

Business case: A common software development framework for the energy transition

Updated: Jul 12, 2022


Summary

  • The integrated framework is 1300% more cost-efficient than business as usual.

  • Around 2 million € in productivity savings can be expected over a 5-year period.

  • The technical implementation is expected to take about a year.

  • Once the technical barriers are removed, incremental development becomes possible.

  • Resistance to cultural change is the main drawback.


Introduction

We are immersed in the great transition from fossil fuels to cleaner energy sources. Enabling this transition requires an astounding amount of calculation, yet the methods and tools commonly used are sorely lacking. The tools’ interoperability issues have become the bottleneck in decision-making processes related to energy infrastructure. There is too much at stake to continue with business as usual.


The modeling situation

Investment decisions in the electrical sector have always been hard to make because the effects of adding or removing equipment in an electrical system are not linear. For instance, if we see that installing a 3 MW transformer near a city produces a benefit, we cannot assume that installing two of them will produce twice the benefit. It may be worse, it may be better. This is due to the different network effects that one or two transformers produce depending on their location and utilization once installed. Network effects are counterintuitive to the human brain, hence the need for specialized software.


In the past, energy sources were predictable (coal, nuclear, hydro, oil and gas), so grid investment studies were simpler: studying peak and valley demand was sufficient to assess the performance of an asset. This is no longer a valid approach. In the last 20 years, a significant part of the energy produced has evolved from being stable and polluting to being variable and less environmentally damaging. Suddenly, evaluating a couple of scenarios could no longer determine whether a given investment would be sufficient, due to the variability introduced by solar radiation and wind. To this uncertainty, we must add climate change and the instability of the fossil fuel supply.


Like every great challenge, this one has more than one aspect. If we break down the exact issues that bottleneck policy and investment decision making: on one side, there is the excruciating difficulty of collaborating when creating models of the infrastructure; on the other, model evaluation and debugging times that span from hours to weeks depending on the model’s accuracy and size. Sadly, we must add that the mainstream software programs are deliberately not interoperable, making it very hard to build lean processes that alleviate the workload.


Naturally, these issues produce delays, unsatisfactory results and turnover in the engineering teams. This situation has been going on for at least 15 years, and there are few efforts at the moment moving in the right direction.


Reinventing the wheel

If we look at the described situation with fresh eyes, we cannot help but wonder: why is everything so old-fashioned? Why is there no “Google Docs” for models? Why is there no “Git” for models? Why do I have to keep sending Excel files over email (or, at best, dropping them into a shared folder)? Our conclusion is that there has been a lack of innovation because the dominant software is very hard to replace; doing so requires reinventing the wheel: reinventing power flow, reinventing optimal dispatch, reinventing the file formats, and so on.


Solving the collaboration issue: the field is dominated by closed standards and formats, mainly those matching the mainstream software manufacturers’ views. What about a simple and open electrical modeling format? While working at the planning department of REE, one of the main things we did was to design a simple yet complete JSON file format for exchanging data among all applications. Later, this led to the creation of a collaboration and tracking system that allowed users to create models with full traceability. CIM/CGMES, despite being a standard of sorts, was out of the question due to its gratuitous complexity. Simplicity is the ultimate form of sophistication, or so we believe.
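To make the idea concrete, here is a minimal sketch of what such an open JSON grid-exchange format might look like. The field names below are illustrative, not REE's actual schema:

```python
import json

# Illustrative sketch of a simple, open grid-model exchange format.
# All field names here are hypothetical, not the actual REE schema.
model = {
    "version": "1.0",
    "buses": [
        {"id": "bus_1", "name": "City A", "v_nom_kv": 220.0},
        {"id": "bus_2", "name": "City B", "v_nom_kv": 220.0},
    ],
    "lines": [
        {"id": "line_1", "bus_from": "bus_1", "bus_to": "bus_2",
         "r_ohm": 0.5, "x_ohm": 5.0, "rate_mva": 400.0},
    ],
}

# Serialize and reload: because it is plain JSON, any tool in the
# pipeline can read and write it without a vendor library.
text = json.dumps(model, indent=2)
restored = json.loads(text)
```

Because the round trip is lossless and human-readable, the format also doubles as a diff-friendly representation for version tracking.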


Solving the interoperability issue: the base situation is usually that all the modeling is done with a closed-source program that becomes the centerpiece. This constrains innovation and forces users to come up with overly creative hacks around it. We discovered that the best strategy was to treat such programs as side pieces of an open modeling system built on the simple, open format mentioned before, so no hacking is needed at all. This lets users continue using their preferred software while allowing competition and innovation to improve the data processing pipeline. For instance, we can now run hundreds of simulations in the cloud with a custom-developed calculation engine, while still being able to run any one of them on the desktop with the dominant commercial software.
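The "side piece" idea boils down to thin exporters from the open model to each vendor's input format. A sketch, with a made-up flat CSV standing in for a real vendor format:

```python
# Sketch: the open model is the single source; each commercial tool gets
# a thin exporter "side piece". The flat CSV below is illustrative only,
# not any real vendor's import format.
def to_vendor_csv(model: dict) -> str:
    """Render the open JSON model as a flat file a legacy tool could import."""
    rows = ["id,bus_from,bus_to,x_ohm"]
    for line in model["lines"]:
        rows.append(f"{line['id']},{line['bus_from']},{line['bus_to']},{line['x_ohm']}")
    return "\n".join(rows)

model = {"lines": [{"id": "L1", "bus_from": "b1", "bus_to": "b2", "x_ohm": 5.0}]}
print(to_vendor_csv(model))
```

Each exporter is a few dozen lines that only the integration team maintains, instead of every analyst hand-massaging files at every step.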


Solving the computation capacity issue: today’s commercial software was designed in the 1970s, and it has been made abundantly clear that the manufacturers are not going to adapt it to new computational paradigms. Fine; we have just detailed how to leave the closed ecosystem through our simple and open modeling format. This allowed us to build competing software that runs in the cloud at a scale no manufacturer’s product matches today. More computation capacity means the ability to run more scenarios and better understand the energy transition. A simple, open file format and the programs developed around it freed us to solve the collaboration, interoperability and computation capacity issues.
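The fan-out pattern behind running hundreds of scenarios can be sketched locally with a placeholder simulation function; the same map-over-scenarios shape transfers to cloud batch workers:

```python
from concurrent.futures import ThreadPoolExecutor

def run_scenario(seed: int) -> float:
    # Placeholder for one simulation run; a real engine would load the
    # open-format model, perturb it with this seed, and return metrics.
    return (seed * 37) % 100 / 100.0

# Hundreds of independent scenarios fan out across workers.
with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(run_scenario, range(200)))
```

Because each scenario only needs the open model file as input, the workers share nothing and scale horizontally.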


The cost of business as usual

The typical business-as-usual planning workflow involves traditional power systems (grid modeling) software plus a market modeling software to simulate the market effects of the electrical infrastructure. The process involves substantial file preparation and model adaptation. The diagram looks like this:


Evaluating a single decision, at an internal employee rate of 40 €/h, costs:

Step | Time | Cost
Design changes on the grid modeling software | 8 h* | 320 €
Prepare the input data coherently for the market model | 168 h | 6,720 €
Adjust the market model results to the grid model software | 40 h | 1,600 €
Prepare a report | 8 h | 320 €
Total | 224 h | 8,960 €

* These times may vary.


We can do better

The business as usual workflow has plenty of steps in which a person has to intervene to adapt the data for the next step. That is a source of friction where mistakes can happen and people get frustrated.


If we observe the modeling workflow from a process point of view, we immediately find that we have many “sources of truth” that evolve over time with no traceability. We also observe that we are forced into that because the grid modeling software and the market modeling software are incompatible. What if everything were compatible? Then we arrive at a much leaner process, where the employees only need to intervene where they add value:


In this process, the data resides in a single place, and the modeling software loads and saves the different models coherently and traceably. The modeling software can then be provided with custom work routines, such as combined electrical-plus-market simulations, that produce coherent results which are processed into automatic reports. The comparable cost scheme improves radically:

Step | Time | Cost
Design changes on the grid modeling software | 8 h* | 320 €
Prepare a report | 8 h* | 320 €
Total | 16 h | 640 €
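The leaner process described above can be sketched end to end. The function bodies and result values below are placeholders; the point is that one model object flows through every step:

```python
# Sketch of the leaner workflow: one model store, automated steps between
# design and report. Bodies and values are placeholders, not real results.
def load_model(path: str) -> dict:
    return {"name": path, "lines": 120}

def grid_simulation(model: dict) -> dict:
    return {"max_loading": 0.82}

def market_simulation(model: dict) -> dict:
    return {"system_cost_meur": 14.3}

def make_report(grid: dict, market: dict) -> str:
    return (f"max loading {grid['max_loading']:.0%}, "
            f"system cost {market['system_cost_meur']} M€")

model = load_model("scenario_2030.json")  # the single source of truth
report = make_report(grid_simulation(model), market_simulation(model))
```

Everything between the design change and the finished report runs without a person re-adapting files by hand.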


A 1300% cost improvement (from 8,960 € down to 640 € per decision)


Now let’s say that we run 50 of these processes per year. With the business-as-usual approach, that is 448k€ per year in labor alone, not counting ad-hoc software maintenance or process variations that may make the pipeline even slower. With the improved workflow, the same 50 runs cost 32k€ per year. That adds up to over 2 million € in productivity savings over five years.
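The arithmetic behind these headline figures, using the times and rate from the tables above:

```python
# Savings arithmetic from the two cost tables.
RATE_EUR_H = 40          # internal employee rate
OLD_HOURS, NEW_HOURS = 224, 16
RUNS_PER_YEAR, YEARS = 50, 5

old_cost = OLD_HOURS * RATE_EUR_H                         # 8,960 € per decision
new_cost = NEW_HOURS * RATE_EUR_H                         # 640 € per decision
improvement_pct = (old_cost - new_cost) / new_cost * 100  # 1300 %

old_yearly = RUNS_PER_YEAR * old_cost                     # 448,000 € per year
new_yearly = RUNS_PER_YEAR * new_cost                     # 32,000 € per year
savings_5y = (old_yearly - new_yearly) * YEARS            # 2,080,000 €
```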


Timeline, side effects and drawbacks

We have laid out how to make the decision-making process leaner and we have calculated the economic benefit. How do we implement the new data architecture? The experience in the planning department of REE is that everyone in the department was aligned with the need for change. After presenting the solution, everyone concluded that a central source of truth was imperative. However, change takes time. In our case, there has been a 3-year transition period that includes the research and development needed to produce the software designs and implementations, with a fourth year still to go. This long period is due to the fact that we changed the workflow while the old workflow was still in use. It would be reasonable to expect a similar change to take about a year in an organization with a clear implementation project, where the software is not an unknown variable as it has been for us.


We have streamlined our decision making process, but far more importantly, we have removed most of the technical barriers for innovation. Now, if we need to add a field to the data model, we just do it. We don’t need to beg the manufacturer anymore. The same goes for the development of new functionality; if we need a new specialized type of simulation, we can develop it in our framework incrementally. The benefit of being able to do something that was impossible before is infinite.


The change has been quite radical, and it does not come free of resistance: not everyone shares the same vision of the solution, and many are still emotionally attached to the closed manufacturers’ solutions, mostly because of the personal edge gained from being experts in a particular manufacturer’s software. Naturally, if that software and its workflows lose importance, those people feel relegated as well. In reality, they become more capable than before of providing their best insight, but that is difficult to recognize at first. Certainly, resistance to cultural change is the most important drawback.



