
Data and the stars


An ambitious star-mapping project highlights the growing importance of big data and the cloud, writes ARTHUR GOLDSTUCK.

At an event in Berlin today, the European Space Agency (ESA) is unveiling the biggest set of data about the stars ever gathered. The positions and magnitudes of no fewer than 1.7 billion stars in our Milky Way galaxy have been charted by the Gaia spacecraft, which launched in 2013 and began collecting data a year later.

The spacecraft is also transmitting a vast range of additional data, with distances, motions and colours of more than 1.3 billion stars collected so far. And that is without counting temperature measurements, solar system observations and radiation sources from beyond the galaxy.

“The extraordinary data collected by Gaia throughout its mission will be used to eventually build the most accurate three-dimensional map of the positions, motions, and chemical composition of stars in our Galaxy,” according to a project document. “By reconstructing the properties and past trajectories of all the stars probed by Gaia, astronomers will be able to delve deep into the history of our Galaxy’s formation and evolution.”

The entire project would be impossible were it not for this decade's advances in cloud storage, big data analysis and artificial intelligence. The storage demands alone are mind-boggling. The ESA roped in cloud data services company NetApp, which focuses on managing applications and data across cloud and on-premises environments.

NetApp was previously involved with the Rosetta space mission, which ended with a controlled landing on a comet in 2016. Launched as far back as 2004, Rosetta became the first spacecraft to orbit a comet ten years later, and its lander made the first successful touchdown on a comet's surface.

“For the next two years Rosetta was following the comet and streaming data,” says Morne Bekker, NetApp's South Africa country manager. “But with the comet speeding away from the sun at 120 000km/h, Rosetta would soon lose solar power. Scientists seized the opportunity to attempt what no one had ever tried before — to gather unique observations through a controlled impact with the comet. Despite blistering speeds and countless unknowns, the spacecraft landed just 33m from its target point.

“It’s quite phenomenal when you think of the data and analytics harvested, and the information it can send back. Now we’re helping with the Gaia project. You can imagine how much data is being collected daily. The catalogue will probably end up at 2 petabytes in size – that’s 2 million gigabytes. If you think of the minute points of data being extracted, obviously you have to be using AI and machine learning to analyse all of this.”
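To put those figures in perspective, here is a minimal back-of-the-envelope sketch in Python. The 2-petabyte catalogue size comes from the quote above; the four-year collection window used for the daily average is an assumption for illustration, not an ESA figure.

```python
# Rough scale check on the figures quoted above. The 2 PB catalogue size is
# from the article; the four-year collection window is an assumed value.
PETABYTE_IN_GB = 1_000_000              # decimal units, as used in the quote

catalogue_size_gb = 2 * PETABYTE_IN_GB  # ~2 PB projected catalogue
print(f"Catalogue size: {catalogue_size_gb:,} GB")          # 2,000,000 GB

collection_days = 4 * 365               # assumed: roughly 2014-2018 of routine collection
avg_daily_gb = catalogue_size_gb / collection_days
print(f"Average daily volume: {avg_daily_gb:,.0f} GB/day")  # ~1,370 GB/day
```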

Ruben Alvarez, IT manager at the ESA, sums it up simply: “Data is everything. Our biggest challenge is processing of the data.”


Naturally, ESA required absolute reliability from data storage. It also demanded almost infinite scalability to support the massive data requirements of past, present, and future missions. 

“We have a commitment to deliver data to different institutes in Europe on a daily basis,” says Alvarez. “Adding to the challenge, data from every mission must be accessible indefinitely. In the coming years, we will be launching new missions that will demand huge amounts of data. NetApp provided us with solutions that were scalable, even if we didn’t know in advance how much disk storage we were going to need.”

ESA says it expects to publish the full Gaia catalogue in 2020, making it available online to professional astronomers and the general public, with interactive, graphical interfaces.

The catalogue, says Alvarez, will unlock many mysteries of the stars.

“We call our site the Library of the Universe because we keep the science archive of all of our scientific missions. This is how we allow people to really investigate the universe. It’s all about the data.”

The mission has tremendous scientific implications, but also makes a powerful business case for big data and cloud computing.

“The capabilities for AI and machine learning in the processing of mass amounts of data are far-reaching,” says Bekker. “Not only does it equate to extreme performance, but also to massive non-disruptive scalability where scientists can scale to 20 PB and beyond, to support the largest of learning data sets. Importantly it also allows scientists to expand their data where needed.”
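As a purely illustrative sketch of the kind of machine-learning analysis Bekker describes, the snippet below clusters a small synthetic "star catalogue" by sky position and brightness using scikit-learn. It is not the Gaia processing pipeline; the data, cluster count and tooling are assumptions chosen only to show the general shape of the task.

```python
# Hypothetical illustration only: cluster a toy star catalogue by position
# and brightness. The real Gaia pipeline runs on distributed infrastructure
# and far larger data sets; this just shows the general approach.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)

n_stars = 10_000
catalogue = np.column_stack([
    rng.uniform(0, 360, n_stars),    # right ascension (degrees)
    rng.uniform(-90, 90, n_stars),   # declination (degrees)
    rng.normal(15, 2, n_stars),      # apparent magnitude
])

# Group the synthetic records into 8 clusters and count membership.
labels = KMeans(n_clusters=8, n_init=10, random_state=0).fit_predict(catalogue)
print("Stars per cluster:", np.bincount(labels))
```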

Across Africa, the power of the cloud and big data is only slowly being harnessed. A new research project, Cloud Africa 2018, conducted by World Wide Worx for global networking application company F5 Networks, shows that cloud uptake is now pervasive across Kenya, Nigeria and South Africa.

However, the research reveals that each country experiences the benefits of the cloud differently. Respondents in Nigeria and Kenya named business efficiency and scalability as by far the most important benefit, with 80% and 75% respectively selecting it as an advantage, compared with only 61% of South African respondents.

The opposite was true of the benefit South Africans rated most highly: time-to-market, or speed of deployment, was cited by 68% of respondents. In contrast, only 48% of companies in Kenya and 28% in Nigeria named it as a key benefit.

This appears to be a function of the infrastructure challenges in developing information technology markets like Nigeria and Kenya, where the cloud is used to overcome the obstacles that get in the way of efficiency.

In South Africa, where construction of the giant Square Kilometre Array radio telescope is due to begin next year, the lessons of Rosetta and Gaia will ensure that data collection, storage and analysis will no longer be a challenge.

  • Arthur Goldstuck is founder of World Wide Worx and editor-in-chief of Gadget.co.za. Follow him on Twitter at @art2gee, and on YouTube.


Legion gets a pro makeover

Lenovo’s latest Legion gaming laptop, the Y530, pulls out all the stops to deliver a sleek looking computer at a lower price point, writes BRYAN TURNER


Gaming laptops have become synonymous with thick bodies, loud fans, and rainbow lights. Lenovo’s latest gaming laptop is here to change that.

The unit we reviewed housed an Intel Core i7-8750H with an Nvidia GeForce GTX 1060 GPU. It featured dual storage: one bay fitted with a Samsung 256GB NVMe SSD and the other with a 1TB HDD.

The latest addition to the Legion lineup is far more professional-looking than the previous-generation Y520. This trend is becoming more prevalent in the gaming laptop market and appeals to those who want to use a single device for work and play. Instead of sporting flashy colours, Lenovo has opted for an all-black body and a monochromatic white light scheme.

The laptop features an all-metal body with sharp edges and comes in at just under 24mm thick. Lenovo opted to make the Y530’s screen lid a little shorter than the bottom half of the laptop, which allowed for more goodies to be packed in the unit while still keeping it thin. The lid of the laptop features Legion branding that’s subtly engraved in the metal and aligned to the side. It also features a white light in the O of Legion that glows when the computer is in use.

The extra length of the base facilitates better cooling, and Lenovo has upgraded its Legion fan system from the previous generation. Passive cooling, which relies on the design of the body rather than the fans, handles regular office use without the fans starting up at all. A gaming laptop with good passive cooling is rare, and Lenovo has shown that it can be achieved with a good build.

The internal fans start when gaming, as one would expect. They are about as loud as other gaming laptops, but this won’t be a problem for gamers who use headsets.



Serious about security? Time to talk ISO 20000


By EDWARD CARBUTT, executive director at Marval Africa

The looming Protection of Personal Information (PoPI) Act in South Africa and the introduction of the General Data Protection Regulation (GDPR) in the European Union (EU) have brought information security to the fore for many organisations. This, in addition to the ISO 27001 standard that must be adhered to in order to help protect information, has caused organisations to scramble to ensure their information security measures are in line with regulatory requirements.

However, few businesses realise that if they are already ISO 20000 certified and follow Information Technology Infrastructure Library (ITIL) best practices, they are effectively positioning themselves for related standards such as ISO 27001. In doing so, organisations can decrease the effort and time needed to adhere to the policies of this security standard.

ISO 20000, ITSM and ITIL – Where does ISO 27001 fit in?

ISO 20000 is the international standard for IT service management (ITSM) and reflects a business's ability to adhere to the best practice guidelines contained within the ITIL framework.

ISO 20000 is process-based and tackles many of the same topics as ISO 27001, such as incident management, problem management, change control and risk management. It is therefore clear that if security forms part of ITSM's outcomes, it should already be taken care of. So why aren't more businesses looking to ISO 20000 to help them become ISO 27001 compliant?

The link to information security compliance

Information security management is a process that runs across the ITIL service life cycle, interacting with all other processes in the framework. It is one of the key aspects of the ‘warranty of the service’, managed within the Service Level Agreement (SLA). The focus is on ensuring that the quality of services produces the desired business value.

So, how are these standards different?

Even though ISO 20000 and ISO 27001 have many elements in common, there are still significant differences. Organisations should take cognisance that ISO 20000 treats risk as one of the building blocks of ITSM, but the standard remains service-based. Conversely, ISO 27001 has risk management at its foundation, whereas ISO 20000 encompasses much more.

Why ISO 20000?

Organisations should ask themselves how they will derive value from ISO 20000. In short, ISO 20000 certification gives ITIL ‘teeth’. ITIL is not prescriptive, and it is difficult to maintain momentum without adequate governance controls – ISO 20000 is prescriptive. ITIL does not insist on continual service improvement – ISO 20000 does. ITIL does not insist on evidence to prove quality and progress – ISO 20000 does. And ITIL itself is not being demanded by business – governance controls, auditability and agility are. The certification verifies an organisation's ability to deliver ITSM to ITIL standards.

Ensuring ISO 20000 compliance provides peace of mind and shortens the journey to achieving other certifications, such as ISO 27001.
