
We have to rethink data


In today’s era of global digitalization there are many examples that show that IT matters. Developments like cloud computing, the IoT and AI are proving that IT has again become a business driver, says WERNER VOGELS, CTO of Amazon.com.

How companies can use ideas from mass production to create business with data

Strategically, IT doesn’t matter. That was the provocative thesis of a much-discussed 2003 article in the Harvard Business Review by the US writer Nicholas Carr. At the time, companies were spending more than half of their entire capital investment on IT, yet in a way that did not differentiate them. In a world in which the same tools are equally accessible to every company, they cannot offer any competitive advantage – so went the argument. The author recommended steering investments toward strategically relevant resources instead. In the years that followed, many companies outsourced their IT activities because they no longer regarded them as part of their core business.

A new age

Nearly 15 years later, the situation has changed. In today’s era of global digitalization there are many examples that show that IT does matter. Developments like cloud computing, the internet of things, artificial intelligence, and machine learning are proving that IT has (again) become a strategic business driver. This is transforming the way companies offer products and services to their customers today. Take the example of industrial manufacturing: in prototyping, drafts for technologically complex products are no longer physically produced; rather, their characteristics can be tested in a purely virtual fashion at every location across the globe by using simulations. The German startup SimScale makes use of this trend. The founders had noticed that in many companies, product designers worked in a very detached manner from the rest of production. The SimScale platform can be accessed through a normal web browser. In this way, designers are part of an ecosystem in which the functionalities of simulations, data and people come together, enabling them to develop better products faster.

Value-added services are also playing an increasingly important role for both companies and their customers. For example, Kärcher, the maker of cleaning technology, manages its entire fleet through the cloud solution “Kärcher Fleet”. The solution transmits data from the company’s cleaning devices – for example their maintenance and charging status, when the machines are in use, and where they are located. The benefit for customers: authorized users can view this data and manage their inventories across different sites, making maintenance processes much more efficient.

Kärcher benefits as well: by developing this service, the company gains precise insight into how its machines are actually used by customers. With this knowledge, Kärcher can generate new top-line revenue through subscription models for its analytics portal.

More than mere support

These examples underline that the purpose of software today is not solely to support business processes, but that software solutions have broadly become an essential element in multiple business areas. This starts with integrated platforms that can manage all activities, from market research to production to logistics. Today, IT is the foundation of digital business models, and therefore has a value-adding role in and of itself. That can be seen when salespeople, for example, interact with their customers in online shops or via mobile apps, or when marketers use big data and artificial intelligence to find out more about the future needs of their customers. Breuninger, a fashion department store chain steeped in tradition, has recognized this and relies on a self-developed e-commerce platform in the AWS Cloud. Breuninger uses modern software development patterns, such as Self-Contained Systems (SCS), to increase development speed with agile, autonomous teams and to test new features quickly. Each team acts according to the principle “You build it, you run it”: the teams themselves are responsible for running the software in production. The advantage of this approach is that operational aspects are considered from the moment new applications are designed.

Value creation through data

In a digital economy, data is at the core of value creation, whereas physical assets are losing their significance in business models. Until 1992, the most highly valued companies in the S&P 500 Index were those that made or distributed things (for example the pharmaceutical industry, retail). Today, developers of technology (for example medical technology, software) and platform operators (social media enablers, credit card companies) are at the top. Trade in data now also contributes more to global growth than trade in goods. IT has therefore never been more strategically important than it is now – not only for us, but for every company in the digital age. Anyone who wants to develop their business digitally can’t do so today without also thinking about which IT infrastructure, which software and which algorithms they need to achieve their plans.

If data takes center stage, then companies must learn how to create added value from it – namely, by combining the data they own with external data sources and by using modern, automated analytics processes. This is done through software and IT services that are delivered via software APIs.
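As a minimal sketch of this idea – the data source, field names and the stubbed external lookup below are purely illustrative, not any real company's API – internal records can be joined with external context before analysis:

```python
# Sketch: enriching a company's own usage data with an external data source
# before analysis. All names and fields here are illustrative assumptions.

def fetch_external_weather(site: str) -> dict:
    # Stand-in for a call to a real external API (e.g. a weather service).
    return {"site": site, "avg_temp_c": 21.5}

def enrich(own_records: list[dict]) -> list[dict]:
    """Join each internal usage record with external context data."""
    enriched = []
    for record in own_records:
        external = fetch_external_weather(record["site"])
        # Merge internal and external fields into one analysis-ready record.
        enriched.append({**record, **external})
    return enriched

usage = [{"site": "plant-a", "machine_hours": 120}]
result = enrich(usage)
print(result)
```

The point is the shape of the pipeline: owned data in, external data merged via an API, and an enriched record out that an automated analytics process can consume.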

Companies that want to become successful and innovative digital players need to get better at building software solutions. We should consider how to organize the ‘production’ of data in such a way that we ultimately come out with a competitive advantage. We need mechanisms that enable the mass production of data using software and hardware capabilities. These mechanisms need to be lean, seamless and effective, while still meeting quality requirements. Those are exactly the challenges that the industrialization of manufacturing processes solved for physical goods. A company that wants to industrialize ‘software production’ needs to work out how to achieve the same kind of lean, first-class mass production that has long existed for industrial goods. Inevitably, the first place to look will be lean production approaches such as Kanban and Kaizen, or total quality management. In the 1980s, companies like Toyota revolutionized the production process by reengineering the entire organization around such principles. Creating those conditions, from both an organizational and an IT standpoint, is one of the biggest challenges companies face in the digital age.

Learn from lean

Can we transfer this success model to IT as well? The answer is yes. In the digital world, it is crucial to establish data-centric processes and continuously improve them. Any obstacles that stand in the way of experimentation and the development of new ideas should therefore be removed as fast as possible. Every new IT project should be regarded as an idea that must pass through a data factory – a fully equipped production site with common processes that can be easily maintained. The end product is high-quality services or algorithms that support digital business models. Digital companies differentiate themselves through their ideas, data and customer relationships; those that find a working digital business model fastest will have a competitive edge. Above all, the barrier between software development and the operating business has to be overcome, because the success, speed and frequency of these experiments depend both on the performance of IT development and on the relevance of the solutions for business operations. Autoscout24 has gained an enormous amount of agility through its cloud solution. The company now has 15 autonomous, interdisciplinary teams working constantly to test and explore new services. The main goal is to be able to iterate experiments quickly across the widest range of architectures, combine services with each other, and compare approaches.

In order to become as agile as Autoscout24, companies need a “machine” that produces ideas. Why not transfer the success formulas from industrial manufacturing and the principles of quality management to the creation of software?

German industrial companies in particular possess a manufacturing excellence that has been built up over many decades. Where applicable, they should do their best to transfer this knowledge to their IT, and in particular to their software development.

In many companies, internal IT know-how has not developed fast enough in recent years – quite contrary to the technological possibilities. Customers provide feedback online immediately after their purchase, real-time analyses are possible through big data, and software updates are delivered daily through the cloud. Often, the IT organization and its associated processes couldn’t keep up. As a consequence, specialist departments with yesterday’s structures are supposed to fulfill tomorrow’s customer requirements. Bringing innovative products and services to market quickly is not possible with long IT sourcing cycles. It’s no wonder that many specialist departments try to circumvent their own IT department, for example by shifting activities to the cloud, which offers many powerful IT building blocks through easy-to-use APIs – capabilities for which companies previously had to operate complicated software and infrastructure. But such a decentralized ‘shadow IT’ delivers no lasting improvement: the end effect is that the complexity of the overall system increases, which is not efficient. This pattern should be broken. Development and operations need to work hand in hand instead of sequentially, one after the other, as in the old world – and ideally across many projects running in parallel. Under the heading of DevOps – the combination of “development” and “operations” – IT guru Gene Kim has described the core characteristics of this machinery.

Ensuring the flow

Kim argues that the organization must be built around customer benefit and that the flow of projects must be as smooth as possible. Hurdles that block the creation of customer benefit should be identified and removed. At Amazon, this starts by staffing projects with cross-functional, interdisciplinary teams as a rule. Furthermore, for the sake of agility, teams should not exceed a certain size: we have a rule that a team should be exactly the size that allows everyone to feel full after eating two (large!) pizzas. This approach reduces the number of necessary handovers, increases responsibility, and allows the team to deliver software to customers faster.

Incorporating feedback

The earlier client feedback flows back into the “production process”, the better. In addition, companies must ensure that every piece of feedback is applied to future projects. To avoid getting lost in endless feedback loops, this should be organized in a lean way: Obtaining the feedback of internal and external stakeholders must by no means hamper the development process.

Learning to take risks

“Good intentions never work, you need good mechanisms to make anything happen,” says Jeff Bezos. For that, you need a corporate culture that teaches employees to experiment constantly and to deliver. With every new experiment, one should venture another small step beyond the previous one. At the same time, every team needs to provide data, based on predefined KPIs, about the impact of its experiments. And we need to establish mechanisms that take effect immediately if we go too far or if something goes wrong – for example, if a solution never reaches the customer.

Anyone who has tried it knows it’s not easy to start your own digital revolution in a company and keep the momentum going. P3 advises cellular operators and offers its customers access to data about the quality of cellular networks (for example signal strength, dropped connections and data throughput) – worldwide, and independent of network operator and cellular provider. This allows customers to plan measures to expand their networks, or new offerings to use their capacity more efficiently. By introducing DevOps tools, P3 was able to define an automated process that provisions the required compute infrastructure in the AWS Cloud and deploys project-specific software packages at the push of a button. Moreover, the process definition can be revised by developers, the business or data scientists at any time – for example to cover new regions, add analytics software or adopt new AWS services. Now P3 can focus fully on its core competence: developing its proprietary software. Data scientists can use the freed-up resources to analyze, in real time, data collected from around the world and put the resulting insights at their clients’ disposal.
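Such a “push-button” process is typically driven by a machine-readable process definition that anyone on the team can revise under version control. As a sketch only – the resource names, instance type and AMI ID below are placeholders, not P3’s actual configuration – an infrastructure template for the AWS Cloud might look like this:

```yaml
# Illustrative CloudFormation-style template (placeholder values throughout)
AWSTemplateFormatVersion: "2010-09-09"
Description: Analytics worker for network-quality data (illustrative sketch)
Resources:
  AnalyticsInstance:
    Type: AWS::EC2::Instance
    Properties:
      InstanceType: m5.large            # sized for the analytics workload
      ImageId: ami-00000000000000000    # placeholder image with software pre-installed
      Tags:
        - Key: project
          Value: network-quality-analytics
```

Because the definition is plain text, developers, the business or data scientists can change it – add a region, another instance, extra analytics software – and redeploy the same way.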

The cloud offers IT limitless possibilities on the technical side, from which new opportunities have been born. But it’s becoming ever clearer what is required in order to make use of these opportunities. Technologies change faster than people. And individuals faster than entire organizations. Tackling these challenges is a strategic necessity. Changing the organization is the next bottleneck on the way to becoming a digital champion.


Which IoT horse should you back?

The emerging IoT is evolving at a rapid pace, with more companies entering the market. The development of new products and communication systems is likely to continue to grow over the next few years, after which we could begin to see a few dominant players emerge, says DARREN OXLEE, CTO of Utility Systems.


But in the interim, many companies face a dilemma: in such a new industry, there are many unknowns about its trajectory. With the variety of options available (particularly regarding the communication medium), there’s the question of which horse to back.

Many players also haven’t fully come to grips with the commercial models in IoT (specifically, how much it costs to run these systems).

Which communication protocol should you consider for your IoT application? Depends on what you’re looking for. Here’s a summary of the main low-power, wide area network (LPWAN) communications options that are currently available, along with their applicability:

SIGFOX 

SigFox arguably has the most traction in the LPWAN space, thanks to its successful marketing campaigns in Europe. It also has strong support from vendors including Texas Instruments, Silicon Labs, and Axom.

It’s a relatively simple, ultra-narrowband (100 Hz) technology that sends very small payloads (12 bytes) very slowly (300 bps). That makes it well suited to applications in which systems need to send small, infrequent bursts of data. Its limited downlink capabilities, however, could make it unsuitable for applications that require two-way communication.
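To get a feel for those numbers, a back-of-the-envelope calculation from the figures above (ignoring protocol overhead and message repetitions, so real frames spend longer on air):

```python
# Time on air for just the payload bits of a maximum-size SigFox message,
# using the figures quoted above. Protocol overhead is deliberately ignored.
payload_bytes = 12
bitrate_bps = 300

airtime_s = payload_bytes * 8 / bitrate_bps
print(f"{airtime_s:.2f} s")  # 0.32 s for the payload bits alone
```

Even without overhead, a third of a second per 12-byte message makes clear why the technology suits infrequent bursts rather than continuous streams.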

LORA 

LoRaWAN is a standard governed by the LoRa Alliance. It’s not fully open, because the underlying radio chipset is only available from Semtech – though this should change in the future.

Its functionality is similar to SigFox’s: it’s primarily intended for uplink-only applications with multiple nodes, although downlink messages are possible. Unlike SigFox, however, LoRa uses multiple frequency channels and data rates with coded messages. These are less likely to interfere with one another, which increases the capacity of the concentrator.

RPMA 

Ingenu Technology Solutions has developed a proprietary technology called Random Phase Multiple Access (RPMA) in the 2.4 GHz band. Thanks to its architecture, it’s said to offer superior uplink and downlink capacity compared to other technologies.

It also claims better Doppler, scheduling, and interference characteristics, as well as a better link budget: 177 dB, compared to LoRa’s 157 dB and SigFox’s 149 dB. And because it operates in the 2.4 GHz spectrum – globally available thanks to Wi-Fi and Bluetooth – no regional architecture changes are needed, unlike with SigFox and LoRa.

LTE-M 

LTE-M (LTE Cat-M1) is a cellular technology that has gained traction in the United States and is specifically designed for IoT or machine‑to‑machine (M2M) communications.

It’s a low‑power wide‑area (LPWA) interface that connects IoT and M2M devices with medium data rate requirements (375 kb/s upload and download speeds in half duplex mode). It also enables longer battery lifecycles and greater in‑building range compared to standard cellular technologies like 2G, 3G, or LTE Cat 1.

Key features include:

  • Voice functionality via VoLTE
  • Full mobility and in‑vehicle hand‑over
  • Low power consumption
  • Extended in‑building range

NB-IOT 

Narrowband IoT (NB‑IoT or LTE Cat NB1) is part of the same 3GPP Release 13 standard that defined LTE Cat M1; both are LPWAN technologies that operate in licensed spectrum and work virtually anywhere. NB‑IoT connects devices simply and efficiently over already established mobile networks, and handles small amounts of infrequent two‑way data securely and reliably.

NB‑IoT is well suited for applications like gas and water meters, which send small amounts of data at regular intervals, because network coverage is a key issue in smart metering rollouts. Meters also tend to sit in difficult locations such as cellars, deep underground, or remote areas. NB‑IoT’s excellent coverage and penetration address this.

MY FORECAST

The LPWAN technology stack is fluid, so I foresee it evolving more over the coming years. During this time, I suspect that we’ll see:

1. Different markets adopting different technologies, based on factors like dominant technology players and local regulations

2. The technologies diverging for a period and then converging around a few key players, which I think will be SigFox, LoRa, and the two LTE-based technologies

3. A significant technological shift in 3-5 years, which will disrupt this space again

So, which horse should you back?

I don’t believe it’s prudent to pick a single technology now; lock-in could cause serious restrictions in the long term. A modular, agile approach to implementing the right communications mechanism for your requirements carries less risk.

The commercial model is also hugely important. The cellular and telecommunications companies will understandably want to maximise their returns, so you’ll want to position yourself to secure an equitable share of the revenue.

So: do your homework. And good luck!


MS Office hack attacks up 4X


In-the-wild exploits – software that takes advantage of a bug or vulnerability – for Microsoft Office topped the list of cyber headaches in Q1 2018. Overall, the number of users attacked with malicious Office documents rose more than fourfold compared with Q1 2017. In just three months, Microsoft Office’s share of exploits used in attacks grew to almost 50% – double its average share across 2017. These are the main findings from Kaspersky Lab’s Q1 IT threat evolution report.

Attacks based on exploits are considered very powerful, as they require no additional interaction with the user and can deliver their dangerous code discreetly. They are therefore widely used, both by cybercriminals looking for profit and by more sophisticated nation-state actors.

The first quarter of 2018 experienced a massive inflow of these exploits, targeting popular Microsoft Office software. According to Kaspersky Lab experts, this is likely to be the peak of a longer trend, as at least ten in-the-wild exploits for Microsoft Office software were identified in 2017-2018 – compared to two zero-day exploits for Adobe Flash player used in-the-wild during the same time period.

The share of Flash exploits in attacks is decreasing as expected (accounting for slightly less than 3% in the first quarter), as Adobe and Microsoft have put a lot of effort into making Flash Player difficult to exploit.

After cybercriminals find out about a vulnerability, they prepare a ready-to-go exploit. They then frequently use spear-phishing as the infection vector, compromising users and companies through emails with malicious attachments. Worse still, such spear-phishing attack vectors are usually discreet and very actively used in sophisticated targeted attacks – there were many examples of this in the last six months alone.

For instance, in late 2017, Kaspersky Lab’s advanced exploit prevention systems identified a new Adobe Flash zero-day exploit used in the wild against our customers. The exploit was delivered through a Microsoft Office document, and the final payload was the latest version of the FinSpy malware. Analysis of the payload enabled researchers to confidently link this attack to a sophisticated actor known as ‘BlackOasis’. In the same month, Kaspersky Lab’s experts published a detailed analysis of CVE-2017-11826, a critical zero-day vulnerability used to launch targeted attacks against all versions of Microsoft Office. The exploit for this vulnerability is an RTF document containing a DOCX document that exploits CVE-2017-11826 in the Office Open XML parser. Finally, just a couple of days ago, information on the Internet Explorer zero-day CVE-2018-8174 was published. This vulnerability was also used in targeted attacks.

“The threat landscape in the first quarter again shows us that a lack of attention to patch management is one of the most significant cyber-dangers. While vendors usually issue patches for the vulnerabilities, users often can’t update their products in time, which results in waves of discreet and highly effective attacks once the vulnerabilities have been exposed to the broad cybercriminal community,” notes Alexander Liskin, security expert at Kaspersky Lab.

Other online threat statistics from the Q1 2018 report include:

  • Kaspersky Lab solutions detected and repelled 796,806,112 malicious attacks from online resources located in 194 countries around the world.
  • 282,807,433 unique URLs were recognised as malicious by web antivirus components.
  • Attempted infections by malware that aims to steal money via online access to bank accounts were registered on 204,448 user computers.
  • Kaspersky Lab’s file antivirus detected a total of 187,597,494 unique malicious and potentially unwanted objects.
  • Kaspersky Lab mobile security products also detected:
    • 1,322,578 malicious installation packages.
    • 18,912 mobile banking Trojans (installation packages).

To reduce the risk of infection, users are advised to:

  • Keep the software installed on your PC up to date, and enable the auto-update feature if it is available.
  • Wherever possible, choose a software vendor that demonstrates a responsible approach to a vulnerability problem. Check if the software vendor has its own bug bounty program.

  • Use robust security solutions, which have special features to protect against exploits, such as Automatic Exploit Prevention.
  • Regularly run a system scan to check for possible infections and make sure you keep all software up to date.

  • Businesses should use a security solution that provides vulnerability assessment, patch management and exploit prevention components, such as Kaspersky Endpoint Security for Business. The patch management feature automatically eliminates vulnerabilities and proactively patches them. The exploit prevention component monitors suspicious actions by applications and blocks the execution of malicious files.


Copyright © 2018 World Wide Worx