With Renewables, It’s Location, Location, Location


By Catherine Bolgar

As renewable energy projects multiply, and with momentum expected to accelerate thanks to the Paris Agreement on climate change, advances in modeling and simulation are improving site selection in order to maximize the return on investment.

A flood of new, mostly small entrants are providing simulations using meteorological parameters to compile detailed information about solar irradiation and wind conditions. “It’s relatively new. It’s disrupting the market,” says Nicolas Fichaux, senior program officer at the International Renewable Energy Agency (IRENA), the global intergovernmental organization for renewable energy, based in Abu Dhabi. “If you’re on a remote island in the Pacific, you can contact a private company and, for a couple of thousand dollars or euros, buy data about how much solar or wind power you have in a specific location. It will give you a good overview on the site quality. If you move ahead, you will combine it with local measurements to develop a bankable project.”

IRENA helps governments looking for the best sites for renewable energy projects. Considerations include not only the amount of sun or wind, but also topography, environmental factors and the distance to grid connections and population centers.

“We can help a government to select the combination of technology and area where energy will be the most cost effective,” Dr. Fichaux says. “We can say, ‘If you develop this cluster, this is the price of electricity you could expect, depending on the cost of capital.’ We can also test policies—for a given tariff, we can assess whether the returns will be fair. We can do this for every square kilometer on the globe. Three years ago, nobody could do that.”

The models use decades of public data from satellites, as well as other information, such as aircraft data and detailed mapping.

“We take the initial conditions—the larger-scale portrait of the atmosphere—and we improve the resolution and have more details of the flow characteristics,” says Gil Lizcano, Brussels-based director of research and development at Vortex, a cloud-based wind-modeling company headquartered in Barcelona. “We make meteorological models with very high resolutions of 100 square meters. You need to compute with high resolution to distinguish areas within a wind-farm domain.”

Clients may be project developers, manufacturers, governments or other entities. Locations may be greenfields—for example, a government wants to know the best places in a country for wind farms—or microsites, where the locations have already been chosen and the question is how to distribute the turbines for maximum effect, including their spacing and height. Technical advances have increased turbine capacity, blade length and tower height.

“Now modern turbines are 80 to 120 meters, even higher,” Mr. Lizcano says. “Those 40 meters can make a lot of difference. Wind increases with height, and we need to know how much.”
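How much wind increases with height is often approximated with the logarithmic wind profile. The sketch below illustrates the idea; the roughness length and the measured values are invented for illustration, not figures from Vortex:

```python
import math

def wind_speed_at_height(v_ref, h_ref, h, roughness=0.03):
    """Extrapolate wind speed from a reference height to a new hub height
    using the logarithmic wind profile (assumes neutral stability;
    roughness=0.03 m is typical of open grassland)."""
    return v_ref * math.log(h / roughness) / math.log(h_ref / roughness)

# Illustrative: 6 m/s measured at 80 m, extrapolated to a 120 m hub.
v_120 = wind_speed_at_height(6.0, 80.0, 120.0)
```

Because turbine power scales roughly with the cube of wind speed, even the modest gain from those extra 40 meters can translate into a noticeably higher energy yield.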

Turbulence is another factor, reducing machine performance and lifespan; stronger machines are available but more expensive, and the client needs to know whether they’re necessary and worth the investment, he says.

Similarly for solar irradiation, the SoDa service combines a database of images taken every 15 minutes for 12 years by the Meteosat satellites with geographic data, such as altitude and land cover, to show where solar irradiation is high or low, with a resolution of a square kilometer. “On the scale of a country, this is very precise,” says Etienne Wey, general manager of Transvalor Innovation, which operates SoDa and is part of Transvalor SA, a Mougins, France, company that works closely on research with the Mines ParisTech engineering school.

Steep slopes, forests, swamps and farms aren’t ideal for solar plants; neither are places that might be very sunny but are far from population centers. “We use techniques of exclusion to filter the area with other data, then rank the site based not only on solar irradiation but also on, say, the distance to an electrical sourcepoint, because it has to plug into the network, or availability of water for cleaning mirrors if it’s concentrated solar power. Then we make a map with the ranking,” Dr. Wey says.
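The exclusion-then-ranking workflow Dr. Wey outlines can be sketched as a filter-and-score pass over candidate grid cells. The field names, thresholds and weights below are hypothetical placeholders, not SoDa's actual schema:

```python
def rank_sites(cells, max_grid_km=50, max_slope_deg=10):
    """Drop unsuitable cells, then rank the rest by a score that rewards
    irradiation and penalizes distance to the electrical network.
    All limits and weights are illustrative assumptions."""
    suitable = [c for c in cells
                if c["slope_deg"] <= max_slope_deg
                and c["grid_km"] <= max_grid_km
                and not c["protected"]]
    return sorted(suitable,
                  key=lambda c: c["irradiation"] - 2.0 * c["grid_km"],
                  reverse=True)

cells = [
    {"id": "A", "irradiation": 2100, "slope_deg": 3,  "grid_km": 5,  "protected": False},
    {"id": "B", "irradiation": 2300, "slope_deg": 15, "grid_km": 2,  "protected": False},  # excluded: too steep
    {"id": "C", "irradiation": 1900, "slope_deg": 2,  "grid_km": 40, "protected": False},
]
ranked = rank_sites(cells)
```

Note that cell B, the sunniest site, is excluded outright before scoring, which is the point of the exclusion step: no amount of irradiation compensates for a disqualifying constraint.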

“What we have added are tools where you click on a point on a map and we will give the amount of photovoltaic production you can have at any place. Also, how much hot water you can create if you put in so many square meters of solar hot-water collectors,” he says. “We are trying to polish our crystal ball.”

For successful integration of renewables into the electrical network, a key element will be energy storage systems (ESS). Such systems, commonly called batteries, will ensure a minimum level of energy availability as well as a good level of energy quality. Whereas only around 2 GW is installed today, IRENA estimates that the world needs 150 GW of battery storage to meet the desired target of 45% of power generated from renewable sources by 2030.

Once sites are narrowed down, customers need to complement the satellite data with measurements taken on the ground. “The amount of radiation over a year could have an error of 3%-5%,” Dr. Wey says. “On a global scale it’s not much, but as competitive bids for solar plants are more precise and cheaper, missing a prediction by 2%-3% could mean losing money. The ground measurements correct any bias in the satellite data.”
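Correcting a satellite series against ground measurements, as Dr. Wey describes, is often done with a simple ratio adjustment over the period where the two overlap. This is a minimal sketch with invented numbers, not Transvalor's actual method:

```python
def bias_ratio(satellite, ground):
    """Mean ratio of ground to satellite readings over the overlap period."""
    return sum(ground) / sum(satellite)

def correct(satellite_series, ratio):
    """Scale the full satellite series by the measured bias ratio."""
    return [v * ratio for v in satellite_series]

# Illustrative daily irradiation values (kWh/m2/day)
sat = [5.0, 5.2, 4.8, 5.1]   # satellite-derived estimates
gnd = [4.9, 5.1, 4.7, 5.0]   # pyranometer readings at the same times

r = bias_ratio(sat, gnd)      # slightly below 1: satellite overestimates
corrected = correct([5.3, 5.0], r)
```

Even a correction of a couple of percent matters here, since the article notes that missing a prediction by 2%-3% can turn a winning bid into a losing one.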

Other companies are creating prediction software to forecast the output of a photovoltaic farm or wind farm over the next three, six or nine hours, so the operators can bid on electricity markets and maximize revenues, Dr. Fichaux of IRENA says.
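A standard baseline for the short-horizon forecasts Dr. Fichaux describes is "persistence": assume output stays near its recent level. Real forecasting services layer weather models on top, but persistence is the benchmark they must beat. The numbers below are invented:

```python
def persistence_forecast(recent_output_mw, horizons=(3, 6, 9)):
    """Naive persistence forecast: predict farm output at each horizon
    (hours ahead) as the mean of the most recent observations."""
    level = sum(recent_output_mw) / len(recent_output_mw)
    return {h: level for h in horizons}

# Illustrative: last three hourly readings from a wind farm, in MW
forecast = persistence_forecast([42.0, 40.0, 44.0])
```

An operator bidding into an electricity market would replace this flat baseline with a weather-model forecast, but measuring skill against persistence shows how much the modeling is actually worth.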

“There is a global understanding that there is a lot to be done on renewable energy,” he says. “We’re ready to move, but we need to estimate how big it will be, how much it will cost and how it will take place. It’s creating a boom in detailed modeling.”



Catherine Bolgar is a former managing editor of The Wall Street Journal Europe, now working as a freelance writer and editor with WSJ. Custom Studios in EMEA. For more from Catherine Bolgar, along with other industry experts, join the Future Realities discussion on LinkedIn.

Photos courtesy of iStock

How should autonomous vehicles handle privacy?


By Catherine Bolgar

The development of autonomous vehicles raises a host of questions about data collection. Some of the issues may arrive as vehicles incorporate more automated systems and become connected to each other and to infrastructure—before they are fully autonomous. Here are some questions likely to arise.


What kind of data might be gathered about vehicles’ movements? By whom?

Fleet operators would likely have some data on their customers, just as the current ride-sharing services already do, says Kara M. Kockelman, professor of engineering at the University of Texas-Austin. “If I order a car, I will have to have a credit card on file for trip charges,” she says. “People already are giving that information up.”

Worries about unauthorized use of credit-card data would be the same as for current car services.

Insurers may want to collect or have access to data to establish liability in case of a crash, to determine weather or road conditions or whether the technology malfunctioned. California law requires autonomous vehicle makers to have a way to capture and store sensor data for at least 30 seconds before a collision and to keep it for at least three years after an incident.
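A pre-collision window like the one California mandates is naturally implemented as a ring buffer that keeps only the most recent sensor frames. This sketch is illustrative; the sample rate and frame contents are assumptions, not anything specified by the law:

```python
from collections import deque

SAMPLE_HZ = 10          # assumed sensor sample rate
WINDOW_SECONDS = 30     # California's minimum pre-collision window

class SensorRecorder:
    """Keep a rolling window of sensor frames; older frames are
    discarded automatically as new ones arrive."""
    def __init__(self):
        self.buffer = deque(maxlen=SAMPLE_HZ * WINDOW_SECONDS)

    def record(self, frame):
        self.buffer.append(frame)   # oldest frame drops off when full

    def snapshot(self):
        """Freeze the last 30 s of frames for the post-incident record."""
        return list(self.buffer)

rec = SensorRecorder()
for t in range(SAMPLE_HZ * 60):      # simulate one minute of driving
    rec.record({"t": t, "speed_mps": 12.0})
frames = rec.snapshot()              # only the final 30 s survive
```

The `deque(maxlen=...)` does the pruning for free: however long the vehicle drives, memory use is bounded and the snapshot always covers exactly the mandated window.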

California also requires that autonomous car makers disclose to buyers what information is collected by the technology on the vehicle, although it doesn’t specify whether the data would be gathered by the manufacturers, government agencies or private companies, or a combination.

For example, “location data, such as the day and time, and which road you’re driving on, would be valuable for transportation planning and understanding where the traffic is,” says Frank Douma, director of the state and local policy program at the University of Minnesota in Minneapolis.

A company collecting the data might want to sell it – already a common practice. “People are probably clicking through consent agreements far too quickly and don’t know if the company will share or sell the information,” Mr. Douma says. “People are putting their information on the open market without realizing it.”

Theoretically, home thieves could tap the data to monitor when occupants have left, but the experts say that’s probably more complicated than what burglars do now, which is just watch target houses in person.

Autonomous vehicles will use many kinds of sensors and technology, such as cameras, radar, LIDAR and GPS. Regulators may allow a vehicle’s GPS to keep track of itself but not share it with anybody, or to change identifiers, to preserve anonymity, Prof. Kockelman says.

The technology for connecting vehicles to each other, known as dedicated short-range communications (DSRC), is under review by the U.S. National Highway Traffic Safety Administration (NHTSA). DSRC would use a bandwidth exclusively reserved for vehicle-to-vehicle communication.

DSRC technology “has built into it anonymization of information,” Prof. Kockelman says. “One vehicle will transmit to other vehicles its speed or whether the brakes go on, but it won’t identify itself as a particular vehicle owned by a particular person.”
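The anonymization Prof. Kockelman describes can be pictured as broadcasting safety data under a short-lived random pseudonym instead of a fixed vehicle identity. This is a toy illustration of the idea, not the actual DSRC security design:

```python
import secrets

class PseudonymBroadcaster:
    """Broadcast speed and brake status under a pseudonym that rotates
    every few messages, so a listener can't link a long trail of
    messages back to one particular vehicle."""
    def __init__(self, rotate_every=2):
        self.rotate_every = rotate_every
        self.count = 0
        self.pseudonym = secrets.token_hex(4)

    def broadcast(self, speed, braking):
        if self.count and self.count % self.rotate_every == 0:
            self.pseudonym = secrets.token_hex(4)   # new unlinkable ID
        self.count += 1
        return {"id": self.pseudonym, "speed": speed, "braking": braking}

b = PseudonymBroadcaster(rotate_every=2)
m1 = b.broadcast(13.0, False)
m2 = b.broadcast(13.1, False)   # same pseudonym as m1
m3 = b.broadcast(12.5, True)    # pseudonym has rotated
```

Nearby vehicles still get the safety-relevant payload (speed, braking) in every message; only the ability to follow one sender over time is broken.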


Will autonomous vehicles be safe from hackers?

While autonomous vehicles are expected to be safer than human drivers, cybersecurity is a concern, as demonstrated by hackers who already have taken control of cars through existing connected systems.

“DSRC is important because it’s much more privacy-protected and secure than if the data were sent over the Internet,” says Dorothy Glancy, law professor at Santa Clara University School of Law in California. “It’s a dedicated network, a closed Internet of vehicles instead of the Internet of Things. That’s the debate right now: whether everything should be connected to everything else.”

Global Automakers, a worldwide industry group, created an Information Sharing and Analysis Center to assess cybersecurity in vehicle electronics. In addition, the European Automobile Manufacturers Association has agreed on secure principles of data protection for connected vehicles and services.


Who will update the maps?

One of the ways autonomous vehicles know where to go is through digital maps. It still isn’t known whether maps will be updated in real time—the way GPS applications currently readjust routes according to traffic conditions or construction work—or hourly or daily, or a combination.

A missed update could theoretically send an autonomous car into a construction zone, though some autonomous vehicles already are able to handle hand signals and flashing arrow signs. The bigger risk is likely to be bad weather conditions, such as snow, that cover lines on the road.

A consortium of European car makers acquired a digital map company that uses wireless transmissions to and from vehicles for updates. Other companies also have advanced mapping capabilities, protected by patents and trade secrets.

“Policymakers want these companies to get together and pool their information. In fact the very first item in the new ‘Federal Automated Vehicles Policy’ Vehicle Safety Assessment is ‘Data Recording and Sharing,’” Prof. Glancy says. “One of the features of the NHTSA guidance is for vehicle manufacturers to share vehicle performance data. Moreover, there’s no reason why you have to have two or three companies collecting the same information on the same roads. You want the most accurate and the most timely for all autonomous vehicles. Will that be a public function? Will it be done by municipalities or states? Will there be sharing across state lines? Who will hold the pool of mapping data?”

Like the autonomous vehicles themselves, answers to these questions remain largely a work in progress. Many stakeholders are working hard to enable the answers, especially on the technology that will connect autonomous cars with infrastructure and make mobility far safer.



Managing the Long Life of Industrial Equipment



By Catherine Bolgar


Industrial equipment lasts a long time—the average age of current equipment in the U.S. is a decade, though some equipment might work for 20-30 years. During that time, it gets customized, modified and modernized, presenting a challenge of tracking those changes to avoid a repetition of previous design errors, as well as to ensure maximum reuse or recycling of parts.

A number of innovations are helping companies manage the task. Product lifecycle management (PLM) systems track equipment from cradle to grave, including the bill of materials (BOM), engineering attributes and other parameters. The Industrial Internet of Things (IIoT), which deploys sensors in equipment, can enhance PLM by gathering data about how the machinery has been used. And new paradigms for ownership are shifting the risk.

“You may have very little documentation, or everything necessary—even a 3D product model,” says Justin Rose, Chicago-based partner at the Boston Consulting Group. “When you have a 3D product model, you can update it in the virtual world, manage design changes, manage performance and design conditions.”

For example, the model will note whether the equipment has changed owners or locations, when and how it has been serviced, the rate at which it consumes energy and other inputs.

While 3D models are still in “early stages of adoption, companies that have very complex pieces of equipment, or expensive capital equipment, are investing the time to build a 3D product model,” Mr. Rose says. “For other companies, if they’re going to refurbish equipment, or if it’s a complex million-dollar-plus product, they might build a 3D product model to support execution of that service.”

But like any tool, the key to realizing value from such models is to get all the individual stakeholders to maintain it year after year. “Sometimes you see more turnover, or an aging workforce, and not having a 3D product model or not using it creates a real risk to the viability of the enterprise going forward,” Mr. Rose says.

The IIoT offers new ways to track how industrial equipment is used—from vibrations to heat to environmental conditions and more. “It’s possible to provide services such as preventive maintenance or monitoring energy consumption,” says Daniele Cerri, research fellow at the Polytechnic University of Milan, in Italy.
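The preventive-maintenance monitoring Mr. Cerri mentions often begins as simple threshold alarms on streaming sensor readings. The channels and limits below are placeholders, not real machine specifications:

```python
# Hypothetical alarm limits for one machine (illustrative values only)
LIMITS = {"vibration_mm_s": 7.1, "bearing_temp_c": 85.0}

def check_readings(readings):
    """Return the sensor channels whose latest value exceeds its
    alarm limit; channels without a limit are ignored."""
    return [name for name, value in readings.items()
            if name in LIMITS and value > LIMITS[name]]

# One incoming frame from the machine's embedded sensors
alerts = check_readings({"vibration_mm_s": 8.3,
                         "bearing_temp_c": 72.0,
                         "energy_kwh": 14.2})
```

A real deployment would trend these readings over time and schedule maintenance before a limit is reached, but even this crude check turns raw sensor data into an actionable signal, which is exactly the step Mr. Cerri says many equipment providers skip.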

However, companies don’t always make the most of the data. “Technology enables a large amount of data collection, but providers of industrial equipment often don’t have a business vision of how to use this data,” he says, and cites an example of one company that produces movement equipment with embedded sensors. “They do it to copy their competitors, but they don’t know how to use that data on their products.”

In addition, the data may be stored in different software in different databases. “People may use too much time to find where the data are located,” Mr. Cerri says, adding that digital platforms that gather and analyze the data can help manage the information and deliver real insights.

Makers of industrial equipment and their customers sometimes have conflicting interests, Mr. Cerri says, because customers “are jealous of their information and don’t want to share it with providers, even if they can obtain more effectiveness and efficiency during the utilization of the equipment.”

By using standardized, modular design, equipment makers can customize equipment quickly to meet customers’ new requirements, Mr. Cerri says.

Standardization also aids with reuse or recycling of parts. Often machine bases can be reused “because they’re quite standardized,” he says. “It’s good cost savings and good environmental impact savings.”

Standardization and modular design also can help with another trend: industrial-equipment makers retaining ownership of the equipment itself and selling use as a service, billed hourly, for example. Modularization would help them easily adapt machinery to individual customer needs and make upgrades.

Companies already following this model use advanced analysis, 3D modeling and simulation tools to predict when equipment needs maintenance, especially because a failure in fast-moving machines can cause much more damage than the failed part itself, BCG’s Mr. Rose says.

“If equipment goes down, they have to make the customer whole somehow,” he says. “It’s a strong motivation for them to keep it maintained and up and running.”


