Showing results for tags 'cloud computing'.
The AchieVer posted a topic in Security & Privacy News

A Look Inside Cloud Computing: Benefits Versus Threats

Cloud computing is the on-demand delivery of IT capabilities in which IT infrastructure and applications are provided to subscribers as a service over a network. Cloud computing technology was first commercialized by Amazon in 2006 with the release of Elastic Compute Cloud (EC2), a key element of Amazon Web Services (AWS). Since then, the cloud has effectively “conquered the world” thanks to the wide variety of benefits the technology provides to clients: economic (lower maintenance costs and larger storage facilities), operational (flexibility, efficiency, and automatic updates), and staffing (less IT staff required, better use of shared resources). Statistics suggest this trend will continue at least into the near future.

Cloud Computing by the Numbers

Amazon’s financial results illustrate that the company derived its largest profit from AWS, which generated 55 percent of Amazon’s total operating profit in the first quarter of 2018, despite accounting for only 12 percent of the company’s net sales. Similarly, Gartner expects the public cloud services market to grow 17.3 percent in 2019, totaling $206.2 billion, as shown in the table below.

Gartner public cloud revenue forecast from 2017 to 2021. (Source: Gartner)

In addition to the forecast above, the IDC FutureScape 2018 prediction puts spending on cloud services and cloud-enabling hardware, software, and services at over $530 billion by 2021. Likewise, according to IDG’s 2018 Cloud Computing Survey, 73 percent of organizations have at least one application or some part of their infrastructure in the cloud, and a further 17 percent expect to be at this level within the next 12 months.
Cloud Computing Service Reference Architecture

Unlike traditional business-to-consumer (B2C) relations, a cloud computing service has a more complex and diversified architecture and includes the following components, defined by the National Institute of Standards and Technology (NIST):

Cloud Provider
Cloud Consumer
Cloud Carrier
Cloud Auditor
Cloud Broker

Cloud computing reference architecture. (Source: NIST)

From a legal perspective, it’s necessary to point out other key players — governments and international organizations. Due to the tremendous amount of stored data and its sensitive nature, many countries strictly regulate issues related to data storage and distribution within a legal framework. The General Data Protection Regulation (GDPR), implemented in 2018 within the European Union and European Economic Area, outlines rules regulating all aspects of data usage and is binding on all member states. Under this law, organizations must report any data breach within 72 hours. In case of violations, they may be subject to a fine of up to €20 million, or up to four percent of their total worldwide turnover for the preceding financial year, whichever is higher.

Cloud computing services are large-scale, elastic systems with multiple clients. Despite serious organizational, technical, and legal efforts, as well as the allocation of financial resources to prevent cloud data breaches, both providers and clients of cloud technologies are subject to multiple malicious actions. These actions include both traditional cyber threats and cloud-specific attacks, which can cause financial, reputational, and technological losses on a global scale.
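The GDPR fine cap described above amounts to taking the greater of two numbers. A minimal sketch, with a function name and sample turnover figures of our own choosing (not from the regulation itself):

```python
def gdpr_max_fine(annual_turnover_eur: float) -> float:
    """Upper bound of a GDPR fine for the most serious violations:
    the greater of EUR 20 million or 4 percent of the preceding
    financial year's worldwide annual turnover."""
    return max(20_000_000.0, 0.04 * annual_turnover_eur)

# A firm with EUR 100 million turnover hits the EUR 20 million floor;
# a firm with EUR 1 billion turnover faces a cap of EUR 40 million.
print(gdpr_max_fine(100_000_000))    # → 20000000.0
print(gdpr_max_fine(1_000_000_000))  # → 40000000.0
```

The floor-versus-percentage structure is why the cap scales with company size: for any turnover above €500 million, the four percent term dominates.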
Notable Breaches

Many major companies have had their cloud data breached:

Microsoft
Dropbox
National Electoral Institute of Mexico
LinkedIn
The Home Depot
Yahoo
World Wrestling Entertainment (WWE)

As a result of these attacks, hundreds of millions of users’ credentials and other personally identifiable information (PII) were sold on the dark web and used by cybercriminals in various fraudulent ways. Insikt Group recently revealed the identity of the notorious hacker tessa88, a dark web seller of compromised Dropbox, LinkedIn, and Yahoo accounts, who is potentially affiliated with the hacker groups behind the attacks on the aforementioned corporations.

Some of the most common traditional cloud computing cyber threats include:

Data breach or loss
Abuse of cloud services
Insecure interfaces and APIs
Malicious insiders
Illegal access to cloud systems
Privilege escalation
Natural disasters
Hardware failure
Supply chain failure
Modification of network traffic
Authentication attacks
Malware attacks
Loss of encryption keys
Compliance risks

Cloud-Specific Cyber Threats

The primary threat to cloud computing services is economic denial of sustainability (EDoS), a variation of the regular denial-of-service (DoS) or distributed denial-of-service (DDoS) attack. Cloud computing services operate according to a service-level agreement (SLA) between the cloud provider and the client, which stipulates the level of service the provider delivers. The resources consumed by the client are billed proportionally by the provider. When fraudsters launch a DDoS attack on a cloud client, the cloud provider is responsible for responding to it in a timely manner and allocating as many resources as needed to neutralize the attack. Thus, the DDoS attack turns into an EDoS attack with a direct negative economic impact.
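The billing dynamic behind EDoS can be sketched numerically. The traffic volumes and the metered price below are hypothetical and not tied to any real provider’s pricing:

```python
# Illustrative sketch: under pay-per-use billing, a sustained request
# flood inflates the victim's bill in direct proportion to attack volume,
# turning a DDoS into an economic denial of sustainability (EDoS).

COST_PER_MILLION_REQUESTS = 0.40  # hypothetical metered price, in USD

def monthly_bill(requests_per_second: float) -> float:
    """Metered cost of one month of sustained traffic."""
    seconds_per_month = 30 * 24 * 3600
    total_requests = requests_per_second * seconds_per_month
    return total_requests / 1_000_000 * COST_PER_MILLION_REQUESTS

legit = monthly_bill(500)               # normal traffic: 500 rps
attacked = monthly_bill(500 + 50_000)   # plus a 50,000 rps flood
print(f"baseline ${legit:,.2f}, under attack ${attacked:,.2f}")
```

Because billing is linear in consumption, a flood 100 times the legitimate load produces roughly a 100-fold bill, which the provider or client must absorb even though the SLA is technically being met.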
In this situation, the client may choose another, less expensive technical solution rather than cloud computing to continue operations, which means a serious reputational loss for the cloud provider.

The Value of Threat Intelligence

Cloud computing is a widespread, convenient, and promising technology, but one that still requires new technological and organizational solutions to secure its clients. When implemented in cloud computing systems, threat intelligence allows organizations to foresee fraudulent actions and alert top management, security teams, and clients in a timely manner. Recorded Future applies artificial intelligence to prevent and defend against the potential threats discussed above, providing its clients with service at the highest level.

Source
steven36 posted a topic in Technology News

BERLIN/PARIS (Reuters) - France and Germany threw their weight on Thursday behind plans to create a cloud computing ecosystem that seeks to reduce Europe’s dependence on Silicon Valley giants Amazon, Microsoft and Google.

The project, dubbed Gaia-X, will establish common standards for storing and processing data on servers that are sited locally and comply with the European Union’s strict laws on data privacy.

German Economy Minister Peter Altmaier, speaking in Berlin, described Gaia-X as a “moonshot” that would help reassert Europe’s technological sovereignty, and invited other countries and companies to join.

“We are not China, we are not the United States, we are European countries with our own values and with our own economic interest that we want to defend,” his French counterpart Bruno Le Maire said in Paris in a joint video news conference.

The initiative comes as France and Germany step up economic cooperation to offset the impact of the coronavirus pandemic. Both have backed an EU-wide recovery plan while Berlin has just announced a major fiscal stimulus.

In an initial step, 22 French and German companies will set up a non-profit foundation to run Gaia-X, which is not conceived as a direct rival to the “hyperscale” U.S. cloud providers but would instead referee a common set of European rules.

“Building a European-based alternative is possible only if we play collectively,” said Michel Paulin, CEO of independent French cloud service provider OVHcloud.

One important concept underpinning Gaia-X is “reversibility”, a principle that would allow users to easily switch providers. First services are due to be offered in 2021.

That is already far too late, according to analysts at Gartner, who forecast that the global market for public cloud services will grow by 17% to $228 billion this year. “The leading cloud providers have already moved quickly to build up this market,” said Gartner analyst Rene Buest.

Source
mood posted a topic in Technology News

AWS: S3 storage now holds over 100 trillion objects

AWS’ S3 online storage service turns 15 years old and now stores over 100 trillion objects.

Amazon Web Services’ (AWS) cloud storage platform S3, or Simple Storage Service, today stores over 100 trillion objects. AWS’s Jeff Barr revealed the figure to mark S3’s 15th anniversary.

AWS launched S3 publicly on March 14, 2006, four years after Amazon launched Amazon.com Web Services, although that was far from the cloud infrastructure service AWS is today. S3 was AWS’ first generally available service, promising developers cheap storage billed by the amount stored per month. Five months later AWS launched Elastic Compute Cloud (EC2), offering developers compute resources as well.

S3 has grown tremendously by object count over the years. AWS S3 hit one trillion objects in 2012, while Microsoft, which launched Azure in October 2008, had four trillion objects that same year.

Barr recalls that S3’s API started with a simple design. “Create a bucket, list all buckets, put an object, get an object, and put an access control list,” notes Barr. Barr also says S3 is designed to provide “eleven 9’s of durability,” meaning that an object stored in S3 has a durability of 99.999999999%.

In the 15 years since S3’s launch, AWS has introduced a host of new services, such as S3 Glacier Deep Archive, a store for large volumes of data that isn’t accessed often; various data replication services; security features; and its Snowmobile shipping container for migrating petabytes of data from on-premises data centers to AWS.

Barr notes that AWS recently “dramatically” reduced latency for 0.01% of the Put requests to S3. “While this might seem like a tiny win, it was actually a much bigger one,” Barr explains, as it helped avoid customer requests that time out and retry. Another benefit was that it gave developers the insights needed to reduce latency.
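A quick back-of-the-envelope calculation shows what “eleven 9’s” means at S3’s current scale. This is our own illustrative arithmetic, treating the durability figure as an annual per-object survival probability:

```python
# "Eleven 9's of durability": if each object survives a year with
# probability 0.99999999999, the expected number of objects lost per
# year out of 100 trillion stored is the product of count and loss rate.

durability = 0.99999999999                 # eleven nines
annual_loss_probability = 1 - durability   # about 1e-11
objects_stored = 100_000_000_000_000       # 100 trillion

expected_losses_per_year = objects_stored * annual_loss_probability
print(f"{expected_losses_per_year:.0f}")   # roughly 1000 objects
```

In other words, even a loss rate that rounds to zero for any individual object becomes a concrete, nonzero number at 100-trillion-object scale, which is why durability engineering dominates large object-store design.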
AWS today remains the largest cloud infrastructure provider, with quarterly revenues exceeding $12 billion and a $46 billion annual run rate. It has also become a star performer within Amazon, with former AWS CEO Andy Jassy recently taking over from founder Jeff Bezos as chief of Amazon.

Source: AWS: S3 storage now holds over 100 trillion objects
The offshoot of Amazon’s online bookstore has led the public cloud market for a decade. How did it get there? Will its dominance continue?

The rumors of Amazon Web Services’ fall from the pinnacle were premature. In the push to democratize cloud computing services, AWS had the jump on everyone from the beginning, ever since it was spun out of the mega retailer Amazon in 2002 and launched the flagship S3 storage and EC2 compute products in 2006. It still does.

AWS quickly grew into a company that fundamentally transformed the IT industry and carved out a market-leading position, and has maintained that lead — most recently pegged by Synergy Research at almost double the market share of its nearest rival Microsoft Azure, with 33 percent of the market to Microsoft’s 18 percent. Market tracker data from IDC for the second half of 2019 also puts AWS in a clear lead, with 13.2 percent of the public cloud services market, narrowly ahead of Microsoft with 11.7 percent.

As with any business, Amazon’s cloud success comes down to a confluence of factors: good timing, solid technology, and a parent company with deep enough pockets to make aggressive capital investments early on. There are other, unique factors that have led to the success of AWS, however, including a relentless customer focus, a ruthless competitive streak, and continued commitment to “dogfooding,” or eating your own dog food — a perhaps unfortunate turn of phrase that has proliferated through the tech industry since the late eighties.

Dogfooding refers to a company making a bet on its own technology — in Amazon’s case by making it publicly available as a product or service. This is what Amazon did with S3 and EC2 in 2006, and it’s what Amazon has been doing with almost all of its AWS product launches since.
We asked the experts how AWS has been able to dominate the public cloud market to date, and, with worldwide adoption of cloud services due to continue climbing, according to the 2020 IDG Cloud Computing Survey, whether AWS can stay on top of the pile for years to come.

First-mover advantage

There is no escaping the fact that Amazon’s jump on the competition put the company in the ascendancy from day one, giving it a six-year head start over its nearest competitor, Microsoft Azure. These years didn’t just help position AWS as the dominant cloud computing service provider in people’s minds; they also furnished the company with years of feedback to crunch through and better serve its customer base of software developers, engineers, and architects.

“They invented the market space; there wasn’t the concept of public cloud like this before,” Dave Bartoletti, vice president and principal analyst at Forrester, said. “We have been renting computing services for 30 or 40 years. Really what AWS did was make it possible, in a corporate environment, for a developer or IT person to go to an external service and start a server with a credit card and do computing somewhere else.”

As Bartoletti notes, AWS wasn’t just first to market; it also had the deep pockets of its parent company, allowing it to blow anyone else out of the water. “They outspent their rivals,” he bluntly assessed. That being said, not all first movers lead their market as definitively as AWS does — just ask the founders of Netscape. “Early movers don’t always have an advantage,” Deepak Mohan, research director for cloud infrastructure services at IDC, said, noting that AWS was especially rigorous in creating and bringing products to market.
“Being a high-quality company and delivering a high-quality product and being responsive to customer needs all play equally important parts.”

A special relationship

Mohan points to Amazon’s superior ability to “eat its own dog food” as a key driver of its success, as the cloud division had to address the significant technology challenges posed by the huge ramp-up in scale Amazon was seeing in the aftermath of the dotcom bubble bursting. “You have to consider the relationship between AWS and Amazon the e-commerce company,” said Ed Anderson, distinguished VP analyst at Gartner — which has AWS as its clear leader in its latest Magic Quadrant for Cloud Infrastructure and Platform Services.

Just as customers of Google Cloud today want to “run like Google,” early AWS customers wanted to leverage the technology that had enabled Amazon to grow into an e-commerce giant so quickly. “A hallmark of AWS has been how technical and capable it has been,” Anderson notes. “And being really oriented around that ‘builder’ audience of developers, implementers, and architects,” he adds. “As a consequence, the sales team is very technical and capable in having those conversations, which means the experience customers have is really smooth.”

Customer obsession

It is that attention to customer needs that has long been a hallmark of the AWS value proposition, even if the company doesn’t always get it right. As Amazon founder and CEO Jeff Bezos wrote in a 2016 letter to shareholders: “Customers are always beautifully, wonderfully dissatisfied, even when they report being happy and business is great. Even when they don’t yet know it, customers want something better, and your desire to delight customers will drive you to invent on their behalf.”

It is this attention to what customers want — and don’t yet know they want, to paraphrase Steve Jobs, by way of Henry Ford — which has been codified in Amazon’s leadership principles. “Leaders start with the customer and work backwards.
They work vigorously to earn and keep customer trust. Although leaders pay attention to competitors, they obsess over customers,” Amazon’s leadership principles state. “That is a value I see exhibited over and over at AWS,” Anderson at Gartner observes. “This attention to customer requirements and the needs of builders and developers and architects, that has prioritized the features they built and is tightly aligned.”

“They are incredibly customer focused and everything they build is driven by the customer,” Bartoletti at Forrester adds. “To maintain that as their large pool of customers continues growing gives them the advantage of knowing what their customers want.”

Take the 2019 release of the hybrid cloud product AWS Outposts as an example. Instead of squaring neatly with Amazon’s public cloud-centric view of the world, Outposts met the customer’s needs in a different sphere: their on-prem data centers.

Everything services-first

A key move made by Bezos in the early days of commercial cloud computing was formalizing the way AWS would build and expose products to its customers. Referencing an early-2000s internal email mandate from Bezos, former Amazon and Google engineer Steve Yegge paraphrased in his 2011 Google Platforms Rant that: “All teams will henceforth expose their data and functionality through service interfaces. Teams must communicate with each other through these interfaces.” Lastly, “Anyone who doesn’t do this will be fired,” Yegge added.

With this mandate, Bezos spurred the creation of an enormous service-oriented architecture, with business logic and data accessible only through application programming interfaces (APIs). “From the time Bezos issued his edict through the time I left [in 2005], Amazon had transformed culturally into a company that thinks about everything in a services-first fashion.
It is now fundamental to how they approach all designs, including internal designs for stuff that might never see the light of day externally,” Yegge wrote. The enormous service-oriented architecture had effectively transformed an infrastructure for selling books into an extensible, programmable computing platform. The online bookstore had become a cloud.

The everything store for enterprise builders

All of this has led to an unrivalled breadth and maturity of services available to AWS customers. And while Amazon had the jump on the competition, it hasn’t rested on its laurels, regularly pioneering new services in the public cloud, such as the cloud-based data warehouse Redshift, the high-performance relational database service Aurora, and the event-based serverless computing platform Lambda, after developing the latter service for its AI-driven virtual assistant Alexa.

“Yes, Google Cloud and Microsoft have ‘closed the gap,’ but AWS is still more capable on breadth of offerings and the maturity of those individual services,” Anderson at Gartner says. “I would say when it comes to market perception, most customers feel Azure and AWS are effectively on par and Google slightly behind. In terms of pure capability, though, AWS is a more mature architecture and set of capabilities, and the breadth is wider.”

At the AWS re:Invent conference in December 2019, AWS said it had 175 services, with a wealth of options and flavors across compute, storage, database, analytics, networking, mobile, developer tools, management tools, IoT, security, and enterprise applications. “Without doubt the market leader, AWS often wins on developer functionality, due to the breadth of its services as a result of its first-mover advantage,” Nick McQuire, vice president of enterprise research at CCS Insight, says.
“AWS has also done a good job at translating its scale into economic benefits for customers, although there are times where cloud can be cost prohibitive.”

This broad set of capabilities can also be seen as a negative by some, with the service catalog representing a dizzying maze of services and options, but this level of choice has also proved a great resource for engineers. Bartoletti at Forrester, who has called AWS the cloud “everything store” for enterprise builders, points to a key difference in approach. “AWS can have three to four different database services, and they don’t care which one you use, as long as you use it at Amazon,” he notes. “Traditionally, vendors would have had to pick one and run with it. That makes AWS tough to compete with.”

The next phase for cloud computing

The age of AWS dominance shows no sign of ending, but the competition is fierce. “Microsoft has been able to close the gap by being open source focused and commercializing that in their cloud as fast as AWS,” Bartoletti says. “Google is working hard not to over-rotate on the bleeding edge and to focus on helping enterprises migrate workloads to the cloud.”

Breadth and maturity of services, underpinned by strong engineering chops and a relentless customer focus, look set to keep AWS ahead of the curve for some time. Now, the company’s ability to simplify the adoption of new technology for enterprise customers through managed services will be the litmus test for the next wave of cloud computing adoption. It will also determine how AWS fares against the ongoing fierce competition from Microsoft Azure and Google Cloud.

“I think it is far from a given that AWS will always dominate the cloud market,” Mohan at IDC says. At the same time, he acknowledges that the competitors have a lot of catching up to do. “Google is still quite a ways behind, and Microsoft, while a force, has certain advantages in the enterprise market,” Mohan says.
“It is conceivable that companies will get closer, but I don’t expect any substantial changes in the next few years... There is a leap in capacity and scale that is yet to be built. All of this gives [AWS] a clearly dominant position for now.”

As Warren Buffett said, “Never bet against America.” And when it comes to the public cloud market, we’ve learned it would be just as foolish to bet against Amazon.

Source