Innovation

Supercomputing drives the future of wind power

How RES is using advanced modelling techniques to select the best locations for wind farms...

RES is using a supercomputing cluster to advance its modelling techniques when planning new wind farms. Photo: Tim Ferguson/silicon.com

Wind power and renewable energy developer Renewable Energy Systems (RES) is making extensive use of computer modelling and high-performance computing to improve the planning and location of its wind farms.

The company - which was founded as part of the Robert McAlpine construction firm in 1982 - has been building, managing and running wind farms since 1992 and is also working with other renewable energy sources such as solar, biomass and tidal.

The total capacity of RES wind farms now stands at 5.3GW, with 750MW owned and operated by the company.

The company's wind farm projects involve scoping out sites for suitability, obtaining permission to develop them, sourcing turbines and building the facility. When selecting wind farm sites, the company makes use of advanced computer modelling - including computational fluid dynamics (CFD), also used by Formula One teams for aerodynamic work - to determine wind behaviour at potential sites.

To deal with the demands of the modelling work, the company upgraded its high-performance computing (HPC) cluster about 18 months ago with 16 Dell M610 blade servers using 32 Intel X5550 quad-core processors and a 20TB Dell PowerVault storage array.

The system uses Red Hat Linux and is based in sister company Robert McAlpine's datacentre near its Kings Langley head office in Hemel Hempstead, Hertfordshire.

Although RES gathers physical measurements from meteorological masts when assessing wind farm sites, the modelling process allows teams to determine how other factors, which are difficult and expensive to measure in sufficient detail, could affect a site. The modelling helps determine not only where to locate wind farms but also where to place individual turbines within each site.

RES technical manager Peter Stuart explained that the wind farm planners use mesoscale modelling on the HPC cluster to create wind maps at a 2km scale. These maps take into account areas that can't be used, such as sites of special scientific interest, to help determine the suitability of locations.

This modelling combines local data with global circulation meteorological data to simulate the movement of wind around sites. "We want to create as accurate wind maps as possible. That hopefully means we can cherry-pick the best sites," Stuart told silicon.com.
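The site-screening step described above can be thought of as masking a coarse wind map and ranking what remains. The sketch below is purely illustrative - the grid values, mask and resolution are assumptions, not RES data or code - but it shows the principle of excluding protected areas before cherry-picking the windiest cells.

```python
import numpy as np

# Illustrative sketch (not RES's actual pipeline): build a coarse wind
# map on a 2km grid, mask out cells that cannot be developed (such as
# sites of special scientific interest), then rank the remaining cells.

rng = np.random.default_rng(42)
# Hypothetical 10x10 grid of mean wind speeds (m/s) at 2km resolution
wind_speed = rng.uniform(5.0, 11.0, size=(10, 10))

# Boolean exclusion mask: True where development is not permitted
excluded = np.zeros((10, 10), dtype=bool)
excluded[2:4, 3:6] = True  # e.g. a protected area

# Mask excluded cells, then find the best remaining cell
usable = np.ma.masked_array(wind_speed, mask=excluded)
best = np.unravel_index(np.argmax(usable.filled(-np.inf)), usable.shape)
print(f"Best available cell: {best}, mean wind {usable[best]:.1f} m/s")
```

In practice the exclusion layers would come from GIS datasets rather than a hand-built mask, but the ranking logic is the same.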

RES also models wind behaviour within wind farm sites in detail to help determine the location of individual wind turbines and even the types of turbines to be used.

This work uses a combination of linear flow modelling - which simplifies the equations of fluid motion by neglecting non-linear terms - for simple landscapes, and non-linear modelling to reflect the behaviour of wind over more complex terrain, such as calculating the effect of trees.

The linear models are relatively simple and are processed on Windows-based desktop computers while the non-linear CFD work takes place on the HPC system.
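A planner's choice between the two approaches comes down to a cost-accuracy trade-off: linear models are cheap enough for a desktop but break down over steep or forested terrain. The dispatch logic below is a hypothetical sketch - the slope threshold and function names are assumptions for illustration, not RES's rules.

```python
# Illustrative sketch of dispatching a site to the cheaper linear model
# or the HPC-based non-linear CFD model. The threshold is an assumption.

MAX_SLOPE_FOR_LINEAR = 0.3  # ~17 degrees; steeper flow may separate


def choose_model(max_terrain_slope: float, has_forestry: bool) -> str:
    """Pick the cheaper linear model for gentle, open terrain;
    fall back to non-linear CFD for steep or forested sites."""
    if max_terrain_slope > MAX_SLOPE_FOR_LINEAR or has_forestry:
        return "non-linear CFD (HPC cluster)"
    return "linear model (desktop)"


print(choose_model(0.1, False))  # gentle open terrain
print(choose_model(0.4, True))   # steep forested terrain
```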

However, neither of these systems is currently linked to the mesoscale information, which limits their accuracy.

In addition, the non-linear modelling doesn't currently take into account non-neutral atmospheric conditions such as air buoyancy - the turbulence generated as the air responds to the ground and the atmosphere heating and cooling at different rates.

To tackle this issue, RES is working with the University of Porto in a joint venture to develop a CFD modelling technique that adds buoyancy and is also able to make use of the data from the mesoscale modelling.

"With the computing power available we can start these things that we previously wouldn't have been able to do," Stuart said.

Further development of these modelling techniques should allow RES to scope out wind farm sites with increasing accuracy, resulting in more efficient wind power generation in the future.

RES hopes to integrate this new model into its HPC work by the end of 2011.
