Showing results for tags 'ibm'.

  1. IBM creates the world’s first 2 nm chip

IBM's new 2 nm process offers transistor density similar to TSMC's next-gen 3 nm.

On Thursday, IBM announced a breakthrough in integrated circuit design: the world's first 2-nanometer process. IBM says its new process can produce CPUs capable of either 45 percent higher performance or 75 percent lower energy use than modern 7 nm designs.

If you've followed recent processor news, you're likely aware that Intel's current desktop processors are still laboring along at 14 nm while the company struggles to complete a migration downward to 10 nm—and that its rivals are on much smaller processes, with the smallest production chips being Apple's new M1 processors at 5 nm. What's less clear is exactly what that means in the first place. Originally, process size referred to the literal two-dimensional size of a transistor on the wafer itself—but modern 3D chip fabrication processes have made a hash of that. Foundries still refer to a process size in nanometers, but it's a "2D equivalent metric" only loosely coupled to reality, and its true meaning varies from one fabricator to the next.

To get a better idea of how IBM's new 2 nm process stacks up, we can take a look at transistor densities, with production process information sourced from Wikichip and information on IBM's process courtesy of AnandTech's Dr. Ian Cutress. Cutress got IBM to translate "the size of a fingernail" into a concrete figure: 150 square millimeters, enough area to pack 50 billion transistors using the new process.
Manufacturer   Example                            Process Size   Peak Transistor Density (millions/sq mm)
Intel          Cypress Cove (desktop) CPUs        14 nm          45
Intel          Willow Cove (laptop) CPUs          10 nm          100
AMD (TSMC)     Zen 3 CPUs                         7 nm           91
Apple (TSMC)   M1 CPUs                            5 nm           171
Apple (TSMC)   next-gen Apple CPUs, circa 2022    3 nm           ~292 (estimated)
IBM            May 6 prototype IC                 2 nm           333

As you can see in the chart above, the simple "nanometer" metric varies pretty strenuously from one foundry to the next—in particular, Intel's processes sport a much higher transistor density than implied by the "process size" metric, with its 10 nm Willow Cove CPUs being roughly on par with 7 nm parts coming from TSMC's foundries. (TSMC builds processors for AMD, Apple, and other high-profile customers.)

Although IBM claims that the new process could "quadruple cell phone battery life, only requiring users to charge their devices every four days," it's still far too early to ascribe concrete power and performance characteristics to chips designed on the new process. Comparing transistor densities to existing processes also seems to take some of the wind from IBM's sails: comparing the new design to TSMC 7 nm is well and good, but TSMC's 5 nm process is already in production, and its 3 nm process, which has a very similar transistor density, is on track for production status next year.

We don't yet have any announcements of real products in development on the new process. However, IBM currently has working partnerships with both Samsung and Intel, who might integrate this process into their own future production.

IBM creates the world’s first 2 nm chip
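IBM's headline density figure can be reproduced from the numbers quoted above; a quick sketch (the 50-billion-transistor and 150 mm² figures are the ones given in the article):

```python
# Transistor density in millions of transistors per square millimeter,
# the metric used in the table above.
def density_millions_per_mm2(transistors: float, area_mm2: float) -> float:
    return transistors / area_mm2 / 1e6

# IBM's quoted figures: 50 billion transistors in 150 mm^2 of silicon.
print(round(density_millions_per_mm2(50e9, 150)))  # 333
```

That matches the 333 million/sq mm entry in the table for IBM's May 6 prototype.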
  2. To Make These Chips More Powerful, IBM Is Growing Them Taller

The company reveals a process that it says can cram two-thirds more transistors on a semiconductor, heralding faster and more efficient electronic devices.

Computer chips might be in short supply at the moment, but chipmakers will continue wringing more power out of them for a while yet, it seems. Researchers at IBM have demonstrated a way to squeeze more transistors onto a chip, a feat of nanoscopic miniaturization that could significantly improve the speed and efficiency of future electronic devices.

The engineering feat might also help the US regain some ground when it comes to minting the world’s most advanced chips, something that has become central to geopolitics, economic competition, and national security. Chips are critical for a growing array of products, and access to faster, more advanced chips is likely to fuel progress in critical areas including artificial intelligence, 5G, and biotechnology.

IBM says 50 billion of the new transistors—the electronic switches that let chips perform logical operations and store data—could fit on a chip the size of a fingernail, two-thirds more than what was possible using the previous process. It says the chip could help a smartphone or laptop run 45 percent faster or consume only one-fourth of the energy of the previous best design.

“It’s a tremendously exciting technology,” says Jesús del Alamo, a professor at MIT who specializes in novel transistor technologies. “It’s a completely new design that pushes forward the roadmap for the future.”

Making the new transistor relies on not simply etching the features of a chip into silicon, but also building them on top of one another.
Chipmakers first began crafting transistors in three dimensions in 2009 using a design called FinFET, in which electrons flow through thin vertical fins—rather than a flat surface—to pass through transistors. The IBM design takes this further, stacking transistors on top of one another in the form of nanosheets that run through a semiconducting material like the layers in a cake. Dario Gil, senior vice president and director of IBM Research, says making the transistors required innovations at various stages of the manufacturing process.

The work comes from IBM’s research lab in Albany, New York, where IBM collaborates with the State University of New York as well as leading chip manufacturing companies. IBM sold off its chipmaking business in 2014, but it continues to fund research on next-generation chip materials, designs, and manufacturing techniques. The company plans to make money by licensing the technology to chipmakers.

For decades, chipmakers have been focused on shrinking the size of components to wring more performance out of chips. Smaller scale allows more components to be packed onto a chip, improving efficiency and speed, but each new generation requires incredible engineering to perfect. The most advanced computer chips today are made using a process that involves etching features into silicon with extreme ultraviolet lithography (EUV), resulting in features smaller than the wavelength of visible light. The process is called “7 nanometer,” but it no longer refers to the size of components; instead, it reflects the generation of technology employed, because of the stacked transistors and other changes in chipmaking. The new IBM chip is three generations ahead, using a process dubbed 2 nanometers.

IBM first demonstrated transistors made this way in 2017, at 5-nanometer process scale. The fact that it has taken four years to move to 2 nanometers shows the challenge of mastering the techniques involved.
The world’s most advanced chip companies have begun making 5-nanometer chips using existing approaches, which appear to be nearing their limits. Dan Hutcheson, CEO of VLSI Research, an analyst firm, says fabricating the 3D components undoubtedly requires new manufacturing tricks. But “they’ve done the most difficult part. It’s a real milestone for the industry,” he says, adding that the performance improvements touted by IBM seem conservative.

Chipmaking progress was most famously captured in Moore’s law, a rule of thumb named after Intel cofounder Gordon Moore which states that the number of transistors on a chip will double every two years or so. Technologists have feared the end of Moore’s law for a decade or more, as chipmakers pushed the limits of manufacturing technology and ran into novel electronic effects.

Meeting the engineering challenges of making new generations of chips can have seismic importance. Intel, once the world’s most advanced chipmaker and still the most sophisticated in the US, has fallen behind TSMC in Taiwan and Samsung in South Korea in recent years, after struggling to master use of EUV in manufacturing. The US has used sanctions to target China over cybersecurity and trade issues. The sanctions have prevented technology companies such as Huawei from buying the latest chips, a move that has reportedly led the company to consider selling off its smartphone business.

“It is an important signal that the United States is not only not far behind, but in some instances is actually ahead,” Hutcheson says. “IBM's research group in Albany has really been one of the best centers for this type of research for the last 10 years.”

In March, Intel’s new CEO, Pat Gelsinger, announced a turnaround plan, including an agreement to collaborate with IBM on research. Intel declined to comment on the IBM announcement. Recent events have served to illustrate the growing importance of silicon chips across the world economy.
The economic shockwaves caused by the pandemic, combined with supply chain disruptions, stockpiling prompted by US chip sanctions, and growing demand for cutting-edge chips in products, have led to shortages across many industries. Carmakers who expected demand for new vehicles to fall during the pandemic have been especially hard hit, with many forced to shutter factories while they wait for chip supplies.

Del Alamo at MIT says it will probably take chipmakers several years to master the tricks that IBM used to make the new transistors. Both Samsung and TSMC, the world’s leading chipmakers alongside Intel, have signaled an intent to use nanosheet transistors, but have yet to do so in production. But del Alamo believes the new approach shows that Moore’s law can keep ticking along. “There’s quite a bit of life left in Moore’s law, and this IBM architecture shows the path forward,” he says. “It’s going to bring very serious manufacturing challenges and a learning curve, but once we overcome this initial, difficult step, we will be coasting for several generations.”

To Make These Chips More Powerful, IBM Is Growing Them Taller (May require free registration)
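As a back-of-the-envelope check on the article's "two-thirds more" claim (an illustrative calculation from the quoted figures, not a number from IBM or the article):

```python
# If 50 billion transistors is two-thirds more than the previous best,
# the previous best held about 50 / (1 + 2/3) transistors on the same area.
# Since 1 + 2/3 = 5/3, dividing by it is the same as multiplying by 3/5.
new_count_billions = 50
previous_best_billions = new_count_billions * 3 / 5
print(previous_best_billions)  # 30.0
```

That implied figure of roughly 30 billion transistors per fingernail-sized chip is consistent with the previous (5 nm class) generation discussed above.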
  3. IBM creates a COBOL compiler – for Linux on x86

What’s this got to do with Big Blue's hybrid cloud obsession? Cloudifying COBOL ... until you repent and go back to z/OS

IBM has announced a COBOL compiler for Linux on x86. News of the offering appeared in an announcement that states: "IBM COBOL for Linux on x86 1.1 brings IBM's COBOL compilation technologies and capabilities to the Linux on x86 environment," and describes it as "the latest addition to the IBM COBOL compiler family, which includes Enterprise COBOL for z/OS and COBOL for AIX."

COBOL – the common business-oriented language – has its roots in the 1950s and is synonymous with the mainframe age and with difficulties paying down technical debt accrued since a bygone era of computing. So why is IBM – which is today obsessed with hybrid clouds – bothering to offer a COBOL compiler for Linux on x86? Because IBM thinks you may want your COBOL apps in a hybrid cloud, albeit the kind of hybrid IBM fancies, which can mean a mix of z/OS, AIX, mainframes, POWER systems and actual public clouds.

COBOL shops have been promised that "minimal customization effort and delivery time are required for strategically deploying COBOL/CICS applications developed for z/OS to Linux on x86 and cloud environments." The new offering does that by linking to DB2 and IBM's Customer Information Control System so that apps on x86 Linux can chat with older COBOL apps. Big Blue has also baked in native XML support to further help interoperability, and created a conversion utility that can migrate COBOL source code developed with non-IBM COBOL compilers.

But the announcement also suggests IBM doesn't completely believe this COBOL-on-x86-Linux caper has a future, as it concludes: "This solution also provides organizations with the flexibility to move workloads back to IBM Z should performance and throughput requirements increase, or to share business logic and data with CICS Transaction Server for z/OS."
The new offering requires RHEL 7.8 or later, or Ubuntu Server 16.04 LTS, 18.04 LTS, or later. Source: IBM creates a COBOL compiler – for Linux on x86
  4. IBM today, for the first time, published its road map for the future of its quantum computing hardware. There is a lot to digest here, but the most important news in the short term is that the company believes it is on its way to building a quantum processor with more than 1,000 qubits — and somewhere between 10 and 50 logical qubits — by the end of 2023.

Currently, the company’s quantum processors top out at 65 qubits. It plans to launch a 127-qubit processor next year and a 433-qubit machine in 2022. To get to this point, IBM is also building a completely new dilution refrigerator to house these larger chips, as well as the technology to connect multiple of these units to build a system akin to today’s multi-core architectures in classical chips.

IBM’s Dario Gil tells me that the company made a deliberate choice in announcing this road map, and he likened it to the birth of the semiconductor industry. “If you look at the difference of what it takes to build an industry as opposed to doing a project or doing scientific experiments and moving a field forward, we have had a philosophy that what we needed to do is to build a team that did three things well, in terms of cultures that have to come together. And that was a culture of science, a culture of the road map, and a culture of agile,” Gil said.

He argues that to reach the ultimate goal of the quantum industry, that is, to build a large-scale, fault-tolerant quantum computer, the company could’ve taken two different paths. The first would be more like the Apollo program, where everybody comes together, works on a problem for a decade and then all the different pieces come together for this one breakthrough moment.

“A different philosophy is to say, ‘what can you do today’ and put the capability out,” he said. “And then have user-driven feedback, which is a culture of agile, as a mechanism to continue to deliver to a community and build a community that way, and you got to lay out a road map of progress.
We are firm believers in this latter model. And that in parallel, you got to do the science, the road map and the feedback and putting things out.”

But he also argues that we’ve now reached a new moment in the quantum industry. “We’ve gotten to the point where there is enough aggregate investment going on that it is really important to start having coordination mechanisms and signaling mechanisms so that we’re not grossly misallocating resources and we allow everybody to do their piece.”

He likens it to the early days of the semiconductor industry, where everybody was doing everything, but over time an ecosystem of third-party vendors sprang up. Today, when companies introduce new technologies like extreme ultraviolet lithography, the kind of road maps that IBM believes it is laying out for the quantum industry help everyone coordinate their efforts.

He also argues that the industry has gotten to the point where the degree of complexity has increased so much that individual players can’t do everything themselves anymore. In turn, that means various players in the ecosystem can now focus on specializing and figuring out what they are best at. “You’re gonna do that, you need materials? The deposition technology? Then in that, you need the device expertise. How do you do the coupling? How do you do the packaging? How do you do the wiring? How do you do the amplifiers, the cryogenics, room temperature electronics, then the entire software stack from bottom to top? And on and on and on. So you can take the approach of saying, ‘well, you know, we’re going to do it all.’ Okay, fine, at the beginning, you need to do all to integrate, but over time, it’s like, should we be in the business of doing coaxial cabling?”

We’re already seeing some of that today, with the recent collaboration between Q-CTRL and Quantum Machines, for example.
Gil believes that 2023 will be an inflection point in the industry, with the road to the 1,121-qubit machine driving improvements across the stack. The most important — and ambitious — of these performance improvements that IBM is trying to execute on is bringing down the error rate from about 1% today to something closer to 0.0001%. Looking at the trajectory of where its machines were just a few years ago, that’s the number the line is pointing toward.

But that’s only part of the problem. As Gil noted, “as you get richer and more sophisticated with this technology, every layer of the stack of innovation ends up becoming almost like an infinite field.” That’s true for the semiconductor industry and maybe even more so for quantum.

And as these chips become more sophisticated, they also become larger — and that means that even the 10-foot fridge IBM is building right now won’t be able to hold more than maybe a million qubits. At that point, you have to build interconnects between these chambers (because when cooling one chamber alone takes almost 14 days, you can’t really experiment and iterate at any appreciable speed). Building that kind of “quantum intranet,” as Gil calls it, is anything but trivial, but it will be key to building larger, interconnected machines.

And that’s just one of the many areas where inventions are still required — and it may still take a decade before these systems are working as expected. “We are pursuing all of these fronts in parallel,” Gil said. “We’re doing investments with horizons where the device and the capability is going to come a decade from now […], because when you have this problem and you only start then, you’ll never get there.”

While the company — and its competitors — work to build the hardware, there are also plenty of efforts in building the software stack for quantum computing.
One thing Gil stressed here is that now is the time to start thinking about quantum algorithms and quantum circuits, even if today they still perform worse on quantum computers than on classical machines. Indeed, Gil wants developers to think less about qubits than about circuits. “When [developers] call a function and now it goes to the cloud, what is going to happen behind the scenes? There are going to be libraries of quantum circuits and there’s going to be a tremendous amount of innovation and creativity and intellectual property on these circuits,” explained Gil.

And then those circuits have to be mapped to the right quantum hardware; indeed, it looks like IBM’s vision here isn’t for a single kind of quantum processor but for ones that have different layouts and topologies. “We are already, ourselves, running over a billion quantum circuits a day from the external world — over a billion a day,” Gil said. “The future is going to be where trillions of quantum circuits are being executed every day on quantum hardware behind the scenes through these cloud-enabled services embedded in software applications.”

Source
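The roadmap numbers above give a feel for the cost of error correction; a rough sketch using only the figures quoted in the article (1,121 physical qubits yielding 10 to 50 logical qubits — the per-logical-qubit ratio is my back-of-the-envelope reading, not IBM's own accounting):

```python
# Physical-qubit overhead per error-corrected logical qubit implied by
# the roadmap target: 1,121 physical qubits, 10-50 logical qubits.
physical_qubits = 1121
for logical_qubits in (10, 50):
    overhead = physical_qubits // logical_qubits
    print(f"{logical_qubits} logical qubits -> ~{overhead} physical per logical")
```

In other words, even at the optimistic end of the range, each usable logical qubit would consume on the order of twenty physical qubits, which is why the error-rate improvements discussed above matter so much.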
  5. IBM to split into two companies by end of 2021

As-yet unnamed “NewCo” will handle IBM’s “managed infrastructure services.”

IBM announced this morning that the company would be spinning off some of its lower-margin lines of business into a new company and focusing on higher-margin cloud services. During an investor call, CEO Arvind Krishna acknowledged that the move was a "significant shift" in how IBM will work, but he positioned it as the latest in a decades-long series of strategic divestments. "We divested networking back in the '90s, we divested PCs back in the 2000s, we divested semiconductors about five years ago because all of them didn’t necessarily play into the integrated value proposition," he said.

Krishna became CEO in April 2020, replacing former CEO Ginni Rometty (who is now IBM's executive chairman), but the spin-off is the capstone of a multi-year effort to apply some kind of focus to the company's sprawling business model.

Cloudy with a chance of hitting the quarterly guidance

The new spin-off doesn't have a formal name yet and is referred to as "NewCo" in IBM's marketing and investor relations material. Under the spin-off plan, the press release claims IBM "will focus on its open hybrid cloud platform, which represents a $1 trillion market opportunity," while NewCo "will immediately be the world’s leading managed infrastructure services provider." (This is because NewCo will start life owning the entirety of IBM Global Technology Services' existing managed infrastructure clients, which means about 4,600 accounts, including about 75 percent of the Fortune 100.)
The Reuters write-up of the split quotes Wedbush Securities analyst Moshe Katri, who categorizes the managed infrastructure business as something IBM is smart to dump: "IBM is essentially getting rid of a shrinking, low-margin operation given the cannibalizing impact of automation and cloud, masking stronger growth for the rest of the operation."

Investors are reacting bullishly to the news of the 109-year-old company's plans. IBM stock is up approximately 7 percent for the day as of press time.

IBM to split into two companies by end of 2021
  6. Adds 100 percent cashback tier if availability dips below 95 percent, removes reference to POWER8 servers

IBM has issued a new Cloud Service Description, the formal document that explains what Big Blue considers to be a cloud and how its offering will behave. The first item on the list of additions in the email sent to customers is: “a new tier to the Service Level Agreement (SLA) which provides a 100% refund on monthly charges, should a service miss a 95 percent availability target.”

95 percent availability means 36 hours of downtime in a month, which is not what clouds are supposed to do. Could IBM be softening up its users for more incidents like its June 2020 crash that took its entire cloud down for hours, including vital status information pages? Don’t panic, IBM cloud user: this isn’t Big Blue in any way saying that you can expect one-and-a-half nines of reliability in its cloud. Instead it’s IBM catching up to rivals like Azure and AWS, which already have 100 percent refund tiers at 95 percent availability and have had them in place for over a year. All also require users to apply for those refunds, which probably isn’t what you’ll really feel like doing after losing a day-and-a-half of uptime!

IBM’s new legalese also removes mobile apps from Big Blue’s definition of its cloud UI, leaving it with front-end options of “on-line portals, APIs, command line interfaces, or, where available, assisted ordering”. There’s also a new set of rules for notifications of price or SLA changes: in last year’s document IBM offered only a 30-day warning of “any changes to this Service Description.” This year’s model offers the more nuanced promise of “at least 30 days' notice of any price increases or changes to this Service Description, and at least 90 days' notice for SLA changes.”

A clause on Apple software running on iOS has been excised from the document, as have references to POWER8-powered servers.
Presumably IBM has upgraded its POWER-powered cloud to newer processors. Source
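The 36-hour figure above follows directly from the SLA arithmetic; a small sketch (the 30-day billing month is an assumption for illustration):

```python
# Allowed downtime, in hours, for a given availability target over a
# 30-day billing month (30 * 24 = 720 hours).
def allowed_downtime_hours(availability_pct: float, days: int = 30) -> float:
    return days * 24 * (1 - availability_pct / 100)

print(round(allowed_downtime_hours(95), 2))    # 36.0 -- the article's figure
print(round(allowed_downtime_hours(99.9), 2))  # 0.72 -- "three nines", by contrast
```

The contrast with a conventional three-nines target shows just how loose a 95 percent refund threshold really is.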
  7. Quantum computing will bring about a sea change and provide the means to thwart existing defenses easily. IBM is pitching enterprises on future-proofing.

IBM Cloud said it will offer cryptography technology that will be future-proofed for quantum computing deployments. Big Blue, which is among the key players in the quantum computing race, launched Quantum Safe Cryptography for Key Management and Application Transactions. Quantum computing promises to solve new problems, leap past supercomputers, and possibly be used to easily break encryption algorithms and data security measures. IBM's bet is that it can combine its security and hybrid cloud knowhow with its quantum computing research.

The new tools under the quantum-safe effort from IBM include:

  • Quantum Safe Crypto Support, a service to secure data transmissions between hardware externally and internally via a quantum-safe algorithm.
  • Extended IBM Cloud Hyper Protect Crypto Services, designed to protect transactional data within applications. The protection covers encryption schemes in databases and digital signature validation.

These services will support IBM Key Protect, including for Red Hat OpenShift on IBM Cloud; IBM Cloud Kubernetes Service; and IBM Cloud Hyper Protect Crypto Services.

The quantum security efforts add to IBM's existing portfolio, including confidential computing, IBM Cloud Data Shield, research, and the IBM Cloud Security and Compliance Center.

Source
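As background on why "quantum-safe" guidance typically pushes toward longer symmetric keys (standard cryptography reasoning, not something from IBM's announcement): Grover's algorithm searches an n-bit keyspace in roughly 2^(n/2) steps, effectively halving a symmetric key's security level. A sketch:

```python
# Effective security of an n-bit symmetric key against a quantum attacker
# running Grover's algorithm (quadratic search speedup halves the bits).
def grover_effective_bits(key_bits: int) -> int:
    return key_bits // 2

for key_bits in (128, 256):
    print(f"AES-{key_bits}: ~{grover_effective_bits(key_bits)} bits post-quantum")
```

This is one reason post-quantum recommendations favor AES-256 over AES-128; public-key schemes such as RSA and elliptic-curve cryptography face the sharper threat of Shor's algorithm and need replacement rather than longer keys.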
  8. In collaboration with Pfizer, Big Blue has developed an AI model using speech samples provided by the Framingham Heart Study.

IBM has partnered with pharmaceutical giant Pfizer to design an artificial intelligence (AI) model that aims to predict the eventual onset of Alzheimer's disease seven years before symptoms appear. Alzheimer's is currently incurable and is often diagnosed too late to prevent it from accelerating. Symptoms of the disease include the gradual degradation of memory, confusion, and difficulty in completing once-familiar daily tasks.

In the study, published in The Lancet's EClinicalMedicine, the researchers trained the AI models on small samples of language data from clinical verbal tests provided by the Framingham Heart Study, a long-term study that has been tracking the health of more than 5,000 people and their families since 1948. The AI model's ability was then verified against data samples from a group of individuals who were healthy at the time of testing, some of whom eventually developed the disease later in life and some of whom did not. For example, if the AI model analysed a speech sample from a participant at the age of 65 and predicted they would develop Alzheimer's by the age of 85, researchers were then able to check records to determine if and when a diagnosis had actually occurred.

According to Big Blue, the outcome of this research was significantly better than predictions based on clinical scales, which draw on other available biomedical data from a patient and only had an accuracy rate of 59%. IBM added that unlike past studies, which focused on individuals who had already started to show symptoms or had a genetic history associated with the disease, this new one only examined healthy individuals with no other risk factors.
"In partnership with our colleagues from Pfizer, we saw the potential to develop AI models which -- if continued to be trained on expanded, robust and diverse datasets -- could one day be used to develop methods to more accurately predict Alzheimer's disease within a large population, including individuals with no current indicators of the disease, no family history of the disease, or signs of cognitive decline," IBM said. IBM said the ability to identify higher-risk patients could potentially lead to successful clinical trials for preventative therapies. "Ultimately, we hope this research will take root and aid in the future development of a more simple, straightforward and easily accessible tool to help clinicians assess a patient's risk of Alzheimer's disease, through the analysis of speech and language in conjunction with a number of other facets of an individual's health and biometrics," the company said. This latest research is part of IBM's ongoing research into Alzheimer's disease. In 2018, the tech giant introduced machine learning to the diagnostics field in the hopes that one day it could assist in the creation of stable and effective diagnostic tests for early-onset of the disease. Source
  9. (Bloomberg) -- International Business Machines Corp. is planning to cut about 10,000 jobs in Europe in an attempt to lower costs at its slow-growth services unit and prepare the business for a spinoff.

The wide-ranging losses will affect about 20% of staff in the region, according to people familiar with the matter. The U.K. and Germany are set to be most impacted, with cuts also planned in Poland, Slovakia, Italy and Belgium. IBM announced the job cuts in Europe earlier in November during a meeting with European labor representatives, according to a union officer briefed on proceedings. The person asked not to be identified because the talks are private.

IBM shares fell 1.6% at 9:37 a.m. in New York. They’ve declined 8.6% this year.

“Our staffing decisions are made to provide the best support to our customers in adopting an open hybrid cloud platform and AI capabilities,” an IBM spokeswoman said in an emailed statement. “We also continue to make significant investments in training and skills development for IBMers to best meet the needs of our customers.”

Hardest hit will be IBM’s legacy IT services business, which handles day-to-day infrastructure operations, such as managing client data centers and traditional information-technology support for installing, operating and repairing equipment. IBM said in October it’s planning to spin off the business and focus on its new hybrid-cloud computing and artificial intelligence unit, which the company hopes will return it to revenue growth. IBM said it aims to complete the carve-out as a tax-free spinoff to IBM shareholders by the end of 2021.

“We’re taking structural actions to simplify and streamline our business,” said IBM Chief Financial Officer James Kavanaugh during the company’s third-quarter earnings call in October.
“We expect the fourth-quarter charge to our operating results of about $2.3 billion.”

Once an iconic blue-chip company, IBM’s star has faded over the years as its legacy mainframe computing and IT services businesses fell behind while newer technology firms like Amazon.com Inc. swooped in to dominate the emerging cloud-computing market.

IBM was already cutting jobs earlier this year, although the company wouldn’t say how many positions were being eliminated. The company has declined to disclose the number of jobs it cuts for decades, with arguably one exception in 1993, when Lou Gerstner, a CEO hired from outside the company, announced 60,000 dismissals.

The spin-off of its services unit is the first big move by Chief Executive Officer Arvind Krishna, who took over from Ginni Rometty in April and has been pushing to revive growth after almost a decade of shrinking revenue. Krishna earlier this year cut thousands of jobs as he began reshaping the business. The current round of job cuts should be completed by the end of the first half of 2021, one of the people added.

Source
  10. HPE and IBM were attacked by hackers working on behalf of the Chinese government, multiple sources have claimed. News of the attack, thought to be part of a long-running campaign known as Cloudhopper, was reported to Reuters by five sources, and targeted secrets belonging both to the tech giants themselves and to their customers. Cloudhopper targets the companies known as managed service providers (MSPs) tasked by the likes of IBM and HPE with managing their IT operations remotely. The attack was able to successfully target the MSPs used by IBM and HPE to gain access to their client networks, and then steal customer information. The MSPs targeted by the attack have not been named, but could cover a range of roles with either firm, from networking to hardware such as servers or storage. Cloudhopper Reuters' sources have claimed that other major technology firms could also have been affected, as Cloudhopper has been in operation for several years. Neither HPE nor IBM has commented on the specific details of the attack, but both provided statements. “IBM has been aware of the reported attacks and already has taken extensive counter-measures worldwide as part of our continuous efforts to protect the company and our clients against constantly evolving threats,” IBM said. “We take responsible stewardship of client data very seriously, and have no evidence that sensitive IBM or client data has been compromised by this threat.” HPE noted that it had spun out its MSP operations to form a new business, DXC Technology, as part of the 2017 merger with Computer Sciences Corp. “The security of HPE customer data is our top priority,” HPE said. “We are unable to comment on the specific details described in the indictment, but HPE’s managed services provider business moved to DXC Technology in connection with HPE’s divestiture of its Enterprise Services business in 2017.” Source
  11. By Joey Sneddon Wondering what Mark Shuttleworth thinks about IBM buying Red Hat? Well, wonder no more. The Ubuntu founder has shared his thoughts on IBM’s game-changing purchase in a short but pointed blog post. And, few of you will be surprised to learn, the space-faring free-software fan thinks the deal marks a “significant moment in the progression of open source to the mainstream”. And rightly so: there was a time when open source was viewed as the outside option. Now, thanks to companies like Red Hat and Canonical, it’s the de-facto option. Naturally Shuttleworth is also feeling bullish about Ubuntu’s position as a Red Hat rival, particularly in the area of cloud computing (the main market motivator behind IBM’s $34 billion buy). And, he adds, the world has moved on — even from Red Hat. “The decline in RHEL growth contrasted with the acceleration in Linux more broadly is a strong market indicator of the next wave of open source,” he writes. “Public cloud workloads have largely avoided RHEL. Container workloads even more so. Moving at the speed of developers means embracing open source in ways that have led the world’s largest companies, the world’s fastest moving startups, and those who believe that security and velocity are best solved together, to Ubuntu.” Shuttleworth says there’s an ‘accelerated momentum’ behind Ubuntu within the enterprise space, in all areas, from IoT, public cloud and Kubernetes to machine learning and AI — all sectors IBM and Red Hat will be hoping their combined clout can carve more market share from. Companies aren’t just using Ubuntu. They’re choosing Ubuntu. It’s a confidence that won’t be knocked by IBM’s deal: “We are determined that Ubuntu is judged as the world’s most secure, most cost-effective and most faithful vehicle for open source initiatives. 
We look forward to helping [companies…] deliver the innovation on which their future growth depends.” While Mark Shuttleworth’s statement doesn’t strictly relate to desktop matters (the primary focus of this site), his take is worth hearing all the same. It’s reassuring to know that far from being intimidated or downbeat about the biggest deal in open-source history, he feels Ubuntu still has plenty to offer. In a game of who can be the biggest, best and most bountiful open-source software company, can the wider FOSS community ever lose? Source
  12. IBM’s Old Playbook

    The best way to understand how it is that Red Hat built a multi-billion dollar business off of open source software is to start with IBM. Red Hat founder Bob Young explained at the All Things Open conference in 2014: Yesterday Young’s story came full circle when IBM bought Red Hat for $34 billion, a 60% premium over Red Hat’s Friday closing price. IBM is hoping it, too, can come full circle: recapture Gerstner’s magic, which depended not only on his insight about services, but also on a secular shift in enterprise computing. How Gerstner Transformed IBM I’ve written previously about Gerstner’s IBM turnaround in the context of Satya Nadella’s attempt to do the same at Microsoft, and Gerstner’s insight that while culture is extremely difficult to change, it is impossible to change nature. From Microsoft’s Monopoly Hangover: A strategy predicated on providing solutions, though, needs a problem, and the other thing that made Gerstner’s turnaround possible was the Internet. By the mid-1990s businesses were faced with a completely new set of technologies that were nominally similar to their IT projects of the last fifteen years, but in fact completely different. Gerstner described the problem/opportunity in Who Says Elephants Can’t Dance: Those of you my age or older surely remember what soon became IBM’s ubiquitous ‘e’: IBM went on to spend over $5 billion marketing “e-business”, an investment Gerstner called “one of the finest jobs of brand positioning I’ve seen in my career.” It worked because it was true: large enterprises, most of which had only ever interacted with customers indirectly through a long chain of wholesalers and distributors and retailers, suddenly had the capability — the responsibility, even — of interacting with end users directly. This could be as simple as a website, or e-commerce, or customer support, not to mention the ability to tap into all of the other parts of the value chain in real-time. 
The technology challenges and the business possibilities — the problem set, if you will — were immense, and Gerstner positioned IBM as the company that could solve these new problems. It was an attractive proposition for nearly all non-tech companies: the challenge with the Internet in the 1990s was that the underlying technologies were so varied and quite immature; different problem spaces had different companies hawking products, many of them startups with no experience working with large enterprises, and even if they had better products no IT department wanted to manage and integrate a multitude of vendors. IBM, on the other hand, offered the proverbial “one throat to choke”; they promised to solve all of the problems associated with this new-fangled Internet stuff, and besides, IT departments were familiar and comfortable with IBM. It was also a strategy that made sense in its potential to squeeze profit out of the value chain: the actual technologies underlying the Internet were open and commoditized, which meant IBM could form a point of integration and extract profits, which is exactly what happened: IBM’s revenue and growth increased steadily — often rapidly! — over the next decade, as the company managed everything from datacenters to internal networks to external websites to e-commerce operations to all the middleware that tied it together (made by IBM, naturally, which was where the company made most of its profits). IBM took care of everything, slowly locking its customers in, and once again grew fat and lazy. When IBM Lost the Cloud In the final paragraph of Who Says Elephants Can’t Dance? Gerstner wrote of his successor Sam Palmisano: Palmisano failed miserably, and there is no greater example than his 2010 announcement of the company’s 2015 Roadmap, which was centered around a promise of delivering $20/share in profit by 2015. 
Palmisano said at the time: Amazon Web Services, meanwhile, had launched a full four years and two months before Palmisano’s declaration; it was the height of folly to not simply mock the idea of the cloud, but to commit to a profit number in the face of an existential threat that was predicated on spending absolutely massive amounts of money on infrastructure. Gerstner identified exactly what it was that Palmisano got wrong: he was “inward-looking and self-absorbed” such that he couldn’t imagine an enterprise solution better than IBM’s customized solutions. That, though, was to miss the point. As I wrote in a Daily Update back in 2014 when the company formally abandoned the 2015 profit goal: The company has spent the years since then claiming it is committed to catching up in the public cloud, but the truth is that Palmisano sealed the company’s cloud fate when he failed to invest a decade ago; indeed, one of the most important takeaways from the Red Hat acquisition is the admission that IBM’s public cloud efforts are effectively dead. IBM’s Struggles So what precisely is the point of IBM acquiring Red Hat, and what if anything does it have to do with Lou Gerstner? Well first off, IBM hasn’t been doing very well for quite some time now: last year’s annual revenue was the lowest since 1997, part-way through Gerstner’s transformation; of course, as this ZDNet article from whence this graph comes points out, $79 billion in 1997 is $120 billion today. The company did finally return to growth earlier this year after 22 straight quarters of decline, only to decline again last quarter: IBM’s ancient mainframe business was up 2%, and its traditional services business, up 3%, but Technology Services and Cloud Platforms were flat, and Cognitive Solutions (i.e. Watson) was down 5%. 
Meanwhile, the aforementioned commitment to the cloud has mostly been an accounting fiction derived from re-classifying existing businesses; the more pertinent number is the company’s capital expenditures, which in 2017 were $3.2 billion, down from 2016’s $3.6 billion. Charles Fitzgerald writes on Platformonomics: The Red Hat Acquisition This is where the Red Hat acquisition comes in: while IBM will certainly be happy to have the company’s cash-generating RHEL subscription business, the real prize is OpenShift, a software suite for building and managing Kubernetes containers. I wrote about Kubernetes in 2016’s How Google is Challenging AWS: This is exactly what IBM is counting on; the company wrote in its press release announcing the deal: This is the bet: while in the 1990s the complexity of the Internet made it difficult for businesses to go online, providing an opening for IBM to sell solutions, today IBM argues the reduction of cloud computing to three centralized providers makes businesses reluctant to commit to any one of them. IBM is betting it can again provide the solution, combining with Red Hat to build products that will seamlessly bridge private data centers and all of the public clouds. IBM’s Unprepared Mind The best thing going for this strategy is its pragmatism: IBM gave up its potential to compete in the public cloud a decade ago, faked it for the last five years, and now is finally admitting its best option is to build on top of everyone else’s clouds. That, though, gets at the strategy’s weakness: it seems more attuned to IBM’s needs than to those of potential customers. After all, if an enterprise is concerned about lock-in, is IBM really a better option? And if the answer is that “Red Hat is open”, at what point do increasingly sophisticated businesses build it themselves? 
The problem for IBM is that they are not building solutions for clueless IT departments bewildered by a dizzying array of open technologies: instead they are building on top of three cloud providers, one of which (Microsoft) is specializing in precisely the sort of hybrid solutions that IBM is targeting. The difference is that because Microsoft has actually spent the money on infrastructure, its ability to extract money from the value chain is correspondingly higher; IBM has to pay rent: Perhaps the bigger issue, though, goes back to Gerstner: before IBM could take advantage of the Internet, the company needed an overhaul of its culture; the extent to which the company will manage to leverage its acquisition of Red Hat will depend on a similar transformation. Unfortunately, that seems unlikely; current CEO Ginni Rometty, who took over the company at the beginning of 2012, not only supported Palmisano’s disastrous Roadmap 2015, she actually undertook most of the cuts and financial engineering necessary to make it happen, before finally giving up in 2014. Meanwhile the company’s most prominent marketing has been around Watson, the capabilities of which have been significantly oversold; it’s not a surprise sales are shrinking after disappointing rollouts. Gerstner knew turnarounds were hard: he called the arrival of the Internet “lucky” in terms of his tenure at IBM. But, as the Louis Pasteur quote goes, “Fortune favors the prepared mind.” Gerstner had identified a strategy and begun to change the culture of IBM, so that when the problem arrived, the company was ready. Today IBM claims it has found a problem; it is an open question if the problem actually exists, but unfortunately there is even less evidence that IBM is truly ready to take advantage of it if it does. Source
  13. Speculation that running a joint venture with shipping giant Maersk might be off-putting to rivals IBM has admitted that its blockchain-based trade platform, set up with shipping giant Maersk, is struggling to gain traction with other carriers. The joint venture began about 10 months ago with the aim of simplifying the cost, complexity and size of global shipping networks, while offering more transparency and cutting the costs and time involved. The platform, named TradeLens, was officially launched in August. The product uses distributed ledger technology to establish a shared, immutable record of all the transactions that take place in the network, so the various trading parties can gain, with permissions, access to that data in real-time. Maersk’s Michael White said in a blogpost at the launch that this would tackle industry issues such as inconsistent data, “complex, cumbersome and often expensive peer-to-peer messaging” and “inefficient clearance processes”. IBM and Maersk began collaborating on blockchain in June 2016, and the reason for launching the joint venture was to allow them to commercialise the product. TradeLens – which is sold as an “open and neutral platform” – has had some successes in signing up port operators and customs authorities: in the summer, it named a group of almost 100 adopters, and just last week added the Port of Montreal to that list. It also announced that the Canada Border Services Agency, which processes more than 14,400 trucks and 127,400 courier shipments and collects more than CDN$88,200,000 in duty and taxes a day, was trialling TradeLens. However, if the platform is to be a success it needs to convince more container carriers to join, as this will allow traders to manage inventory across different carriers. And, with just one carrier – Asian firm Pacific International Lines – signed up, it is struggling. Even IBM is reported to have acknowledged the problem it is facing. 
As Marvin Erdly, head of TradeLens at IBM Blockchain, told blockchain publication CoinDesk: “We do need to get the other carriers on the platform. Without that network, we don't have a product. That is the reality of the situation.” There appears to be broad support for the principle of an industry-wide blockchain standard that can be used for ocean shipping, and so the companies are concerned that the prominent role of Maersk – the world’s largest container shipping company – is putting off rivals. “Obviously the fact that Maersk is driving this is both a really good thing and a worrying thing because they are such a big player in the industry,” Erdly is reported to have said. “As you can imagine that's going to be a factor.” Indeed, ShippingWatch reported in May that execs at carrier giants Hapag-Lloyd and CMA CGM had warned against platforms that one firm controlled, calling for wider governance. “Technically the solution (by Maersk and IBM) could be a good platform, but it will require a governance that makes it an industry platform and not just a platform for Maersk and IBM,” Hapag-Lloyd CEO Rolf Habben Jansen is reported to have told a conference. “This is the weakness we're currently seeing in many of these initiatives, as each individual project claims to offer an industry platform that they themselves control. This is self-contradictory.” IBM and Maersk do seem aware of the issue: Maersk has established an operational subsidiary to manage staff on the project, which the pair say “ensures TradeLens’ independence from other Maersk business units”. In addition, the duo say they are in the process of setting up an advisory board to work with TradeLens leaders “to address key issues such as the use of open and fair standards”. But the IP created from the work is jointly owned by IBM and Maersk – so the creation of a subsidiary and an advisory board could well be seen by the rest of the industry as sticking plasters, not solutions. 
The Reg has asked IBM for further details on plans for the advisory board and any other measures it might have planned, and will update this article if we hear back. Updated - 30 October, 15.59GMT An IBM spokeswoman told us the company is taking the concerns about equity and governance on board and has worked with carriers to address them. “As a result, a range of carriers on both the global and regional level recognize the TradeLens solution,” she said. “Currently, discussions are progressing regarding potential pilots or full network participation with several of them.” The advisory board, the spokeswoman added, “will provide guidance and feedback to help drive open and fair standards for the TradeLens platform”. Source
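The "shared, immutable record" described above can be illustrated with a minimal hash-chain sketch. This is a generic demonstration of why a distributed ledger is tamper-evident, not IBM's or Maersk's actual TradeLens implementation; the record fields and function names are invented for the example.

```python
import hashlib
import json

def record_hash(record: dict, prev_hash: str) -> str:
    """Hash a record together with the previous entry's hash, so any
    later alteration invalidates every subsequent hash in the chain."""
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

def append(ledger: list, record: dict) -> None:
    """Append a record, chaining it to the hash of the previous entry."""
    prev = ledger[-1]["hash"] if ledger else "0" * 64
    ledger.append({"record": record, "hash": record_hash(record, prev)})

def verify(ledger: list) -> bool:
    """Recompute every hash from the start; any edit breaks the chain."""
    prev = "0" * 64
    for entry in ledger:
        if entry["hash"] != record_hash(entry["record"], prev):
            return False
        prev = entry["hash"]
    return True

ledger = []
append(ledger, {"event": "container loaded", "port": "Montreal"})
append(ledger, {"event": "customs cleared"})
print(verify(ledger))                      # True
ledger[0]["record"]["port"] = "Rotterdam"  # tamper with history...
print(verify(ledger))                      # False: the chain no longer checks out
```

In a real multi-party deployment the same chain would be replicated across participants, which is what lets trading partners trust the shared record without trusting any single operator.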
  14. At CES, IBM today announced its first commercial quantum computer for use outside of the lab. The 20-qubit system combines into a single package the quantum and classical computing parts it takes to use a machine like this for research and business applications. That package, the IBM Q system, is still huge, of course, but it includes everything a company would need to get started with its quantum computing experiments, including all the machinery necessary to cool the quantum computing hardware. While IBM describes it as the first fully integrated universal quantum computing system designed for scientific and commercial use, it’s worth stressing that a 20-qubit machine is nowhere near powerful enough for most of the commercial applications that people envision for a quantum computer with more qubits — and qubits that are useful for more than 100 microseconds. It’s no surprise, then, that IBM stresses that this is a first attempt and that the systems are “designed to one day tackle problems that are currently seen as too complex and exponential in nature for classical systems to handle.” Right now, we’re not quite there yet, but the company also notes that these systems are upgradable (and easy to maintain). “The IBM Q System One is a major step forward in the commercialization of quantum computing,” said Arvind Krishna, senior vice president of Hybrid Cloud and director of IBM Research. “This new system is critical in expanding quantum computing beyond the walls of the research lab as we work to develop practical quantum applications for business and science.” More than anything, though, IBM seems to be proud of the design of the Q systems. In a move that harkens back to Cray’s supercomputers with their expensive couches, IBM worked with design studios Map Project Office and Universal Design Studio, as well as Goppion, the company that has built, among other things, the display cases that house the U.K.’s crown jewels and the Mona Lisa. 
IBM clearly thinks of the Q system as a piece of art and, indeed, the final result is quite stunning. It’s a nine-foot-tall and nine-foot-wide airtight box, with the quantum computing chandelier hanging in the middle, with all of the parts neatly hidden away. If you want to buy yourself a quantum computer, you’ll have to work with IBM, though. It won’t be available with free two-day shipping on Amazon anytime soon. In related news, IBM also announced the IBM Q Network, a partnership with ExxonMobil and research labs like CERN and Fermilab that aims to build a community that brings together the business and research interests to explore use cases for quantum computing. The organizations that partner with IBM will get access to its quantum software and cloud-based quantum computing systems. Source
  15. IBM Warns of Apple Siri Shortcut Scareware Risk "Hey Siri" is supposed to be a voice command that enables Apple's digital assistant, but in the wrong hands the new Siri Shortcuts feature could potentially be abused by an attacker. Apple's Siri voice assistant is intended to help users, but according to new research published by IBM on Jan. 31, attackers could potentially abuse the Siri Shortcuts feature. Apple introduced Siri Shortcuts with iOS 12, enabling users and developers to use Siri to automate a series of tasks. IBM's X-Force security division discovered that it is possible to use a Siri Shortcut for malicious purposes, including tricking a user into paying a fee to avoid having his or her information stolen in an attack known as scareware. In a proof-of-concept Siri Shortcuts scareware attack developed by IBM, a malicious shortcut is able to read information from an iOS device and then demand a fee from the user, all with the native Siri voice. "IBM X-Force has not seen evidence of attacks carried out using this method, but we developed the proof of concept to warn users of the potential dangers," John Kuhn, senior security threat researcher for IBM X-Force IRIS, told eWEEK. The IBM disclosure of the Siri Shortcuts risk comes during a particularly challenging week for Apple as the company struggles to deal with a critical FaceTime vulnerability that could enable an attacker to eavesdrop on an unsuspecting user. Unlike the FaceTime vulnerability, however, the Siri Shortcuts issue is not an explicit vulnerability in Apple's technology. "IBM X-Force conducted all of the research using native functionality of the Shortcuts app, so no exploitation of vulnerabilities was needed," Kuhn said. "We highly suggest that every user reviews Shortcuts before adding them to their devices." Kuhn added that IBM worked with Apple since the initial research discovery to share all the details. 
How It Works Siri Shortcuts provides powerful capabilities to users and developers. IBM's concern is that a hacker could abuse that power and trick a user with scareware. There is also the potential, according to IBM, for a Siri Shortcut to be configured to spread to other devices by messaging everyone on the victim’s contact list, expanding the impact of an attack. "Siri Shortcuts gives native capability to potentially send messages to contacts if the appropriate permissions are enabled," Kuhn said. "In theory, this could be manipulated by an attacker to spread a link to other contacts." There are, however, several caveats before a Siri Shortcut attack can spread. Kuhn noted that such an attack would require each user to install and run the Shortcut, which is more reminiscent of malware that uses email to propagate. The Siri Shortcut risk is also not a "drive-by" risk—that is, it isn't something that a user can get simply by visiting a malicious site. The user must install the Siri Shortcuts app as well as the malicious shortcut, he said. However, he noted that attackers could easily entice users to do so by socially engineering the intended victim. "This tactic is commonly used by attackers to get victims to install malware via email phishing attempts," Kuhn said. "Basically, the attacker needs to offer anything enticing enough to get the user to comply with installing an otherwise suspect piece of software." In terms of what data Siri Shortcuts is able to access and then send to an attacker, there are limits in place by default. "Siri Shortcuts does allow access to some system files on the phone. However, it does not allow access to files with PII [personally identifiable information] as far as our research has determined," Kuhn said. "Siri Shortcuts does have native functionality to give the victim's physical address, IP address, photos, videos and more." So what should Apple users do? 
IBM suggests that users be careful when downloading third-party Siri Shortcuts and only install from a trusted source. IBM also suggests that users be mindful when running a Siri Shortcut and only enable actions that are needed. Source
  16. Meet IBM's bleeding edge of quantum computing With the Q System One, the tech titan's grand promise of super-powerful computing takes a big step forward. The Q System One model at the CES 2019 tech show. Sarah Tew/CNET The IBM Q System One model doesn't look like a computer. It looks like a conceptual art series of plates being held together with fishing lines suspended from a ceiling. The whole contraption is encased in half-inch-thick glass created by Milan-based Goppion, which made the protective displays for the Mona Lisa and the Crown Jewels. Bob Sutor, an IBM veteran who leads the Q System One team, directed me to look at the bottom of this quantum computer -- an experimental machine with potentially massive computing power -- where there was a tiny silver rectangle in the middle of a tangle of golden wires. That's the home of the machine's quantum bits, or qubits, which are tiny, fragile particles that make the whole system work. I asked him how much such a computer costs. He declined to say, adding: "It's not lunch." We were standing in the middle of the Las Vegas Convention Center during the CES tech show earlier this month. A jostling crowd around us angled to snag pictures of the model. IBM was at the show to publicly present this replica of the Q System One, its first quantum computer that fits into one neat package. Past designs were more like "backroom experiments," Sutor said, with jumbles of components strewn about a room. The real Q System One was completed in November and is in IBM's Yorktown Heights, New York, offices. The machine represents a big step toward quantum computing becoming a commercial reality, after IBM has toiled for decades with the computing concept. Creating a fully functional system makes quantum computers more reliable and easier to upgrade. 
Beyond those practical uses, these computers have the potential to create more effective antibiotics, help scientists better understand chemistry and nature, and improve power grids. The machines could do that by providing businesses and scientists the ability to crunch extremely complex calculations that can't be digested by classical computers. But beyond that hype, there's years more work to do to prove quantum computers are up to the task. Also, it's possible a different type of computer will lead to the next breakthroughs, instead of quantum designs. "That's a big step, but it's one step in a journey that's 1,000 miles long," Brian Hopkins, a Forrester analyst focused on quantum computers, said of the new Q System One. Super cold computing In a classical computer, data is crunched by processing bits, designated as either 0 or 1. In quantum computing, qubits are used instead. These qubits have more complex properties that allow them to become combinations of 0 and 1 at the same time and also to interact with each other. (Pictured: Bob Sutor standing by the quantum computer model. Sarah Tew/CNET) With each additional qubit that's added, the amount of information a quantum computer can hold doubles. That capability may help a quantum computer become a far more powerful way to process certain kinds of problems that classical computers can't handle. Using these qubits could help scientists unlock ways of developing new medicines at the molecular level or creating stronger security codes or processing the mountains of data being created at CERN's Large Hadron Collider. The Q System One currently uses 20 qubits. "By the time you get up to around 280 [qubits], that number -- two to the 280th power -- is approximately the number of atoms in the observable universe," Sutor said, offering a hint at just how powerful these computers may someday become. 
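The doubling Sutor describes can be checked with a few lines of Python: an n-qubit register is described by 2^n amplitudes, so each added qubit doubles the state space. The 20- and 280-qubit figures are from the article; the ~10^80 atom count for the observable universe is the commonly cited rough estimate, used here only for comparison.

```python
# State-space growth behind the "each qubit doubles the information" claim.
# Python's arbitrary-precision integers handle 2**280 exactly.

def state_space_dimension(n_qubits: int) -> int:
    """An n-qubit register is described by 2**n complex amplitudes."""
    return 2 ** n_qubits

# The Q System One's 20 qubits: about a million amplitudes.
print(state_space_dimension(20))              # 1048576

# Sutor's 280-qubit comparison: 2**280 exceeds the commonly cited
# ~10**80 estimate of atoms in the observable universe.
print(state_space_dimension(280) > 10 ** 80)  # True
```

This exponential growth is also why classical machines cannot simply simulate large quantum registers: storing the full state of even ~50 qubits would already take petabytes of memory.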
Seeing the potential of these computers, startups such as Rigetti and D-Wave, and the research arms of Microsoft, Intel and Google are developing quantum computing, too. IBM has also partnered with ExxonMobil, Daimler, Samsung, Barclays and other major corporations to kick the tires on what's possible with its quantum computers. But using quantum computers is an excruciatingly delicate task. The Q System One's thick glass housing is used to cut down on vibrations and radiation, and helps keep the computer at near absolute zero. Inside the real computer in New York, quick blasts of super-cold air are used to keep the qubits inside at 10 millikelvins, colder than outer space. "So inside a quantum computer is one of the coldest places on Earth," said Sutor, 60, whose 6-foot-4 frame, graying beard, deep voice and cheery disposition give him the air of an IBM Santa Claus. That extreme cold and thick glass are needed to protect the qubits inside the machine, which are so fragile that a single photon of light or a rap of someone's knuckles could destroy their computation, Sutor said. Because these machines are so delicate, any future quantum computing will likely be done over the internet to allow IBM to carefully maintain the machines at its own facilities. A long way to go To be sure, the promise of quantum computers remains just that -- promise, and not yet reality. "Quantum computers are not a magical solution for all problems that classical computers can't solve," Forrester's Hopkins said. "They are a potential solution for some of the problems that classical computers can't solve." He added that the tech industry today is in the middle of discovering what quantum computers can do. Answering those questions will take a few more years, and achieving the ultimate promise of quantum computers could take a decade or two, Hopkins said. 
But thanks to the new Q System One, researchers and the general public now have a notable milestone by which to judge the advance of quantum computing. That system will help cut down on upgrade times for these machines to hours or days, instead of days or weeks. It should also make it easier for IBM to build more of these machines to support a future quantum computing business. "We set out to build something which was highly functional, but beautiful," Sutor said, "and would give us a way to look at what we were doing in the future." Sutor wasn't under any misconceptions that his work is nearly finished. When I asked him what the next steps are for his project, he said: "What do we have to do? Everything." Source
  17. IBM sets forth with a strong cybersecurity message IBM has a strong cybersecurity message, but there's a gap between IBM security and its corporate vision. If IBM can bridge this gap, it can carve out a unique market position. Stephen Lawson/IDG I just got back from attending IBM Think in San Francisco. Though it was a quick trip across the country, I was inundated with IBM’s vision, covering topics from A (i.e. artificial intelligence) to Z (i.e. System Z) and everything in between. Despite the wide-ranging discussion, IBM’s main focus was on three areas: 1) hybrid cloud, 2) advanced analytics, and 3) security. For example, IBM’s hybrid cloud discussion centered on digital transformation and leaned heavily on its Red Hat acquisition, while advanced analytics included artificial intelligence (AI), cognitive computing (Watson), neural networks, etc. To demonstrate its capabilities in these areas, IBM paraded out customers such as Geico, Hyundai Credit Corporation, and Santander Bank, who are betting on IBM for game-changing digital transformation projects. IBM's cybersecurity plans As for cybersecurity, here are a few of my take-aways about IBM's plans: Not surprisingly, IBM is all-in on cybersecurity services, which now account for more than 50 percent of its cybersecurity revenue. According to ESG research (and lots of other industry sources), cybersecurity services growth will continue to outpace products due to the global cybersecurity skills shortage. (Note: I am an employee of ESG.) IBM is banking on this trend by adding staff, investing in backend systems and processes, and rolling out new service offerings. For example, IBM is working with partners on a managed services program where local partners benefit from IBM’s global resources, analytics, and threat intelligence. 
Overall, IBM has a unique opportunity to separate itself from the pack and could become the de facto enterprise cybersecurity services leader. Most cybersecurity professionals think of IBM QRadar as a SIEM, competing with the likes of ArcSight, LogRhythm, and Splunk. While this perspective is true, it minimizes its value. QRadar is really a security operations and analytics platform architecture (SOAPA). Customers can use QRadar as a security operations nexus, adding functionality such as network traffic analysis (NTA), vulnerability management (VM), and user behavior analytics (UBA) to the core system. What’s more, QRadar offers several helper applications, such as DNS analytics, most of which are free. Finally, QRadar has thousands of customers around the world. IBM has some work ahead here – it needs to gain cybersecurity street cred by marketing QRadar as a SOAPA offering and global cybersecurity community, rather than a plain old SIEM. IBM is embracing security “from the cloud.” For example, QRadar on cloud (QROC) revenue grew over 20 percent, demonstrating that customers want the value of QRadar without the infrastructure baggage of on-premises collectors, databases, servers, etc. IBM is also poised to roll out its IBM Security Connected (ICS) platform in Q2. In keeping with its minimalist communications, IBM hasn’t trumpeted the ICS initiative, but in my humble opinion, it represents a major change in direction. For ICS, IBM rewrote its security applications as microservices to build a foundation of cloud integration and scale. Thus, ICS applications will grow from discrete SaaS offerings to an integrated cloud-scale cybersecurity architecture over time. Oh, and ICS will come with lots of services options for everything from staff augmentation to outsourcing. ICS has the potential to be a big deal for overwhelmed CISOs with global responsibilities and the need for massive cybersecurity scale. Resilient is an enterprise-class security operations platform. 
When IBM acquired Resilient Systems a few years ago, it gained a technology leader but sort of ceded the SOAR buzz to other vendors. This is a shame. Resilient may require a bit more work than some of its competitors, but I find that customers are using Resilient to re-architect their security operations processes and establish real and measurable security operations metrics. To me, this is where security operations platforms must go – beyond quick automation and orchestration wins to anchoring security process re-engineering. 4 ways IBM can improve its cybersecurity game IBM’s security portfolio is pretty solid, and the company seems to be more energized than in the past. After attending IBM Think, I do have a few cybersecurity recommendations for folks in Armonk and Cambridge, Massachusetts: While IBM Think has a strong hybrid cloud theme, the IBM security hybrid cloud story remains disjointed – an identity story here, a data security story there, etc. This leads to IBM being outflanked by cloud-savvy security startups. IBM needs a cohesive, tightly integrated product offering and messaging framework here. IBM’s risk management services are solid but somewhat hidden. According to recent ESG research, there is a growing cyber risk management gap between what business executives need and what cybersecurity professionals can deliver. Given its industry knowledge and relationships, IBM should be doing more in the cyber risk management space – at the product and services level. Closely related to #2, cybersecurity is truly a boardroom-level issue – especially for traditional IBM customers. I find that there is a disconnect between IBM’s corporate focus on digital transformation, industry solutions, and hybrid clouds and its cybersecurity go-to-market, which remains centered within the bits and bytes. Again, IBM is in a unique position to figure out a more top-down approach (i.e. 
from the business down to the technology) and deliver business-centric cybersecurity solutions to customers. IBM spent millions of dollars on a Watson for cybersecurity advertising campaign, but few cybersecurity professionals have a clue about what Watson for cybersecurity is. The suits in Armonk should pump the advertising brakes and dedicate more resources to market education by working with professional organizations such as ISSA, ISC2, SANS, the Infosec Institute, etc. In general, Armonk must understand that the IBM brand is a marketing obstacle when competing for mindshare with vendors like CrowdStrike, FireEye, Palo Alto Networks, etc. Thus, IBM security must work harder and smarter to get the word out. Many thanks to IBM for hosting me in San Francisco this week. I’ll be back at the Moscone Center for RSA in the blink of an eye. Source
  18. Severe Java bugs found in IBM Watson and its components A total of five vulnerabilities affected several components of IBM Watson. One of the critical bugs (CVE-2018-2633) can allow attackers to remotely control Watson systems. Watson, IBM’s trademark artificial intelligence (AI) system, was found to be riddled with critical security vulnerabilities. The bugs were identified in the IBM Runtime Environment Java Technology Edition, which is used by Watson Explorer and Content Analytics. IBM has addressed the five vulnerabilities by providing a fix for all the affected components.

The big picture

The vulnerable Java components were JRockit Libraries, JRockit LDAP, JRockit JNDI, and I18n. These flaws could enable attackers to steal sensitive information, conduct denial-of-service attacks, and take control of affected systems. They are designated CVE-2018-2579, CVE-2018-2588, CVE-2018-2602, CVE-2018-2603, and CVE-2018-2633. CVE-2018-2633 was the most severe of the identified vulnerabilities, as it would allow cybercriminals to completely take over Watson. “An unspecified vulnerability in Oracle Java SE related to the Java SE, Java SE Embedded, JRockit JNDI component could allow an unauthenticated attacker to take control of the system,” the bulletin stated. Altogether, 18 IBM Watson products were found to be affected.

Update published

Following the disclosure of the security flaws, IBM released updates for the affected components. Users are advised to upgrade to the required version of the IBM Java Runtime to remediate the five vulnerabilities. All of these flaws were actually addressed in Oracle's January 2018 advisory but continued to impact IBM Watson for lack of a fix until now. Among the affected products, Watson Explorer Foundational Components and Watson Explorer Analytical Components versions formed the major chunk. Source
  19. Symantec partners with IBM, Microsoft and others to cut cyber security cost San Francisco: California-headquartered global cybersecurity company Symantec said it had forged partnerships with 120 companies including Amazon Web Services (AWS), IBM Security, Microsoft and Oracle, among others, to drive down the cost and complexity of cyber security. The enterprise partners are now building or delivering more than 250 products and services that integrate with Symantec's Integrated Cyber Defense (ICD) Platform, the company said on Wednesday. Symantec's ICD Platform provides a unified framework for information protection, threat protection, identity management and compliance across endpoints, networks, applications and clouds. "There's a seismic shift happening in cyber security," Art Gilliland, Executive Vice President and General Manager, Enterprise Products, Symantec, said in a statement. "The old way of fighting cyber-attacks using fragmented tools has become too complex and expensive to manage. Integrated platforms are the future," Gilliland added. Symantec started building ICD two and a half years ago with its acquisition of Blue Coat Systems, which added web and cloud security technologies to Symantec's endpoint, email and data loss prevention (DLP) technologies. Source
  20. IBM sends Blockchain World Wire for global payments into limited production Big Blue's latest blockchain play sees cross-border payments being sent via digital tokens in near real-time. How IBM Blockchain World Wire works (Image: IBM) IBM has announced that its blockchain-based global payments network has been sent into limited production, with the company touting it as the "new financial rail" that clears and settles cross-border payments in near real-time. IBM Blockchain World Wire, Big Blue claims, is the first blockchain-based network that integrates payment messaging, clearing, and settlement on a single network. "The concept of money is 2,000 years old. The world has been using the same network to process financial transactions for 50 years. And even though globalisation has changed the world, payment fees and other financial barriers remain the same. But now there's a new way to move money," the company pitches. "We've created a new type of payment network designed to accelerate remittances and transform cross-border payments to facilitate the movement of money in countries that need it most," IBM Blockchain general manager Marie Wieck added. "By creating a network where financial institutions support multiple digital assets, we expect to spur innovation and improve financial inclusion worldwide." World Wire uses the Stellar protocol -- an open-source, decentralised protocol for digital currency to fiat currency transfer -- to transmit monetary value in the form of digital currency. The blockchain-based network will support settlement using Stellar Lumens (XLM) and the US dollar stable coin through IBM's existing partnership with Stronghold. IBM said that pending regulatory approvals and other reviews, six international banks, including Banco Bradesco, Bank Busan, and Rizal Commercial Banking Corporation (RCBC), have signed letters of intent to issue their own stable coins on World Wire. 
If successful, this will see the addition of the euro, Korean won, Brazilian real, Indonesian rupiah, and Philippine peso stable coins to the network. According to IBM, World Wire has enabled payment locations in 72 countries, with 47 currencies and 44 banking endpoints. "Local regulations will continue to guide activation, and IBM is actively growing the network with additional financial institutions globally," the company said. Source
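The mechanics described above can be sketched in a few lines. This is a hypothetical illustration of the bridge-asset idea only, not World Wire's actual API; the exchange rates, fee, and currency pair are invented for the example:

```python
# Sketch of a cross-border payment routed through a digital bridge asset:
# fiat A is converted into the asset, transmitted over the network (minus a
# small network fee), then converted into fiat B at the destination.
def bridge_transfer(amount, rate_to_asset, rate_from_asset, network_fee_asset=0.0):
    """Convert fiat A -> digital asset -> fiat B, netting a network fee."""
    asset_amount = amount * rate_to_asset - network_fee_asset
    return asset_amount * rate_from_asset

# Send 100 USD to PHP via an asset priced at 0.25 USD per unit,
# with 52 PHP to the USD (so each asset unit is worth 13 PHP):
usd_to_asset = 1 / 0.25   # 4 asset units per USD
asset_to_php = 13.0       # PHP received per asset unit
print(bridge_transfer(100, usd_to_asset, asset_to_php, network_fee_asset=0.5))
# -> 5193.5 PHP
```

The point of the bridge asset is that both ends settle against one instrument in near real-time, rather than each pair of banks maintaining bilateral correspondent accounts.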
  21. Managed services and software optimized for Red Hat OpenShift and Linux aimed at helping enterprises move to the cloud. Image: IBM CEO Ginni Rometty with Red Hat CEO Jim Whitehurst It's only been three weeks since IBM closed its $34 billion takeover of Red Hat, and that was as long as the company was willing to wait until it announced its first joint products with the new subsidiary. According to IBM, it has already "transformed its software portfolio to be cloud-native and optimized it to run on Red Hat OpenShift." The new Cloud Paks are containerized software, specialized by workload and optimized to run on Red Hat's implementation of the open source container application platform OpenShift. They are meant to help enterprises move to the cloud. IBM also announced Red Hat OpenShift on IBM Cloud as a fully managed service and Red Hat OpenShift on IBM Z and LinuxONE for its mainframe customers. In addition, it's offering consulting and technology services for Red Hat, utilizing what it says is "one of the world's largest teams of Red Hat-certified consultants and more than 80,000 cloud application services practitioners" to help its customers move to cloud environments and maintain their cloud infrastructures once the move is made. "Red Hat is unlocking innovation with Linux-based technologies, including containers and Kubernetes, which have become the fundamental building blocks of hybrid cloud environments," Red Hat CEO Jim Whitehurst said in a statement. "This open hybrid cloud foundation is what enables the vision of any app, anywhere, anytime. Combined with IBM's strong industry expertise and supported by a vast ecosystem of passionate developers and partners, customers can create modern apps with the technologies of their choice and the flexibility to deploy in the best environment for the app -- whether that is on-premises or across multiple public clouds." 
The first five Cloud Paks out of the gate are:

- Cloud Pak for Data, which the company says will simplify and automate deriving insights from data while providing an open and extensible architecture to virtualize data for AI faster.
- Cloud Pak for Applications, to help businesses modernize, build, deploy, and run applications.
- Cloud Pak for Integration, for integrating apps, data, cloud services, and APIs.
- Cloud Pak for Automation, to help transform business processes, decisions, and content.
- Cloud Pak for Multicloud Management, to provide multicloud visibility, governance, and automation.

According to IBM, the Cloud Paks provide a common operating model and common set of services with a unified and intuitive dashboard. "IBM is unleashing its software from the data center to fuel the enterprise workload race to the cloud," Arvind Krishna, IBM's senior VP of cloud and cognitive software, said in a statement. "This will further position IBM as the industry leader in the more than one-trillion-dollar hybrid cloud opportunity. We are providing the essential tools enterprises need to make their multi-year journey to cloud on common, open standards that can reach across clouds, across applications and across vendors with Red Hat." All in all, the company says the new software and services draw on more than 100 products from IBM's software portfolio that are optimized for Red Hat OpenShift and Red Hat Enterprise Linux. Source
  22. The system will go online in October. IBM's 14th quantum computer is its most powerful so far, a model with 53 of the qubits that form the fundamental data-processing element at the heart of the system. The system, available online to quantum computing customers in October, is a big step up from the last IBM Q machine with 20 qubits and should help advance the marriage of classical computers with the crazy realm of quantum physics. Quantum computing remains a highly experimental field, limited by the difficult physics of the ultra-small and by the need to keep the machines refrigerated to within a hair's breadth of absolute zero to keep outside disturbances from ruining any calculations. But if engineers and scientists can continue the progress, quantum computers could help solve computing problems that are, in practice, impossible on today's classical computers. That includes things like simulating the complexities of real-world molecules used in medical drugs and materials science, optimizing financial investment performance, and delivering packages with a minimum of time and fuel. Quantum computers rely on qubits to store and process data. Unlike regular computer bits, which can store either a zero or a one, qubits can store a combination of both through a concept called superposition. Another factor is entanglement, which links the states of two qubits even if they're separated. "The new quantum system is important because it offers a larger lattice and gives users the ability to run even more complex entanglement and connectivity experiments," said Dario Gil, director of IBM Research. IBM is competing with companies like Google, Microsoft, Honeywell, Rigetti Computing, IonQ, Intel and NTT in the race to make useful quantum computers. Another company, D-Wave, uses a different approach called annealing that's already got some customers, while AT&T and others are pursuing the even more distant realm of quantum networking. 
If you're used to classical computers, you'll be familiar with powers of 2 that crop up all over the place: 1, 2, 4, 8, 16, 32, 64, 128, 256, 512, etc. So what's up with a quantum computer with 53 qubits? It stems from the hexagonally derived lattice of qubits that's advantageous when it comes to minimizing unwanted interactions, IBM said. IBM is pushing a concept called quantum volume to measure quantum computer performance. It's designed to capture more aspects of quantum computing than just qubits, which can be misleading since other factors can degrade qubit performance. IBM's 20-qubit quantum computers, of which there are now five, have a quantum volume of 16, but IBM hasn't yet tested the 53-qubit model. Source
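The superposition idea mentioned above also explains why each added qubit matters so much: an n-qubit register is described by 2**n complex amplitudes, so the state space grows exponentially. The sketch below is a plain classical simulation for illustration only; it is not how IBM's hardware works, and the function name is my own:

```python
import numpy as np

def uniform_superposition(n_qubits):
    """Equal-amplitude superposition over all 2**n basis states."""
    dim = 2 ** n_qubits
    # Each basis state gets amplitude 1/sqrt(dim), so probabilities sum to 1.
    return np.full(dim, 1 / np.sqrt(dim), dtype=complex)

state = uniform_superposition(3)
print(len(state))                            # -> 8 basis states for 3 qubits
print(round(float(np.sum(np.abs(state)**2)), 10))  # -> 1.0 (total probability)
```

Simulating 53 qubits the same way would require 2**53 (about 9 quadrillion) amplitudes, which is exactly why machines of that size are interesting: they are at the edge of what classical simulation can check.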
  23. IBM makes the Power Series chips, and as part of that has open-sourced some of the underlying technologies to encourage wider use of these chips. The open-source pieces have been part of the OpenPower Foundation. Today, the company announced it was moving the foundation under The Linux Foundation, and while it was at it, announced it was open-sourcing several other important bits. Ken King, general manager for OpenPower at IBM, says that at this point in his organization’s evolution, they wanted to move it under the auspices of the Linux Foundation. “We are taking the OpenPower Foundation, and we are putting it as an entity or project underneath The Linux Foundation with the mindset that we are now bringing more of an open governance approach and open governance principles to the foundation,” King told TechCrunch. But IBM didn’t stop there. It also announced that it was open-sourcing some of the technical underpinnings of the Power Series chip to make it easier for developers and engineers to build on top of the technology. Perhaps most importantly, the company is open-sourcing the Power Instruction Set Architecture (ISA). These are “the definitions developers use for ensuring hardware and software work together on Power,” the company explained. King sees open-sourcing this technology as an important step for a number of reasons around licensing and governance. “The first thing is that we are taking the ability to be able to implement what we’re licensing, the ISA instruction set architecture, for others to be able to implement on top of that instruction set royalty free with patent rights,” he explained. The company is also putting this under an open governance workgroup at the OpenPower Foundation. This matters to open-source community members because it provides a layer of transparency that might otherwise be lacking. 
What that means in practice is that any changes will be subject to a majority vote, so long as the changes meet compatibility requirements, King said. Jim Zemlin, executive director at the Linux Foundation, says that making all of this part of the Linux Foundation open-source community could drive more innovation. “Instead of a very, very long cycle of building an application and working separately with hardware and chip designers, because all of this is open, you’re able to quickly build your application, prototype it with hardware folks, and then work with a service provider or a company like IBM to take it to market. So there’s not tons of layers in between the actual innovation and value captured by industry in that cycle,” Zemlin explained. In addition, IBM made several other announcements around open-sourcing other Power chip technologies designed to help developers and engineers customize and control their implementations of Power chip technology. “IBM will also contribute multiple other technologies including a softcore implementation of the Power ISA, as well as reference designs for the architecture-agnostic Open Coherent Accelerator Processor Interface (OpenCAPI) and the Open Memory Interface (OMI). The OpenCAPI and OMI technologies help maximize memory bandwidth between processors and attached devices, critical to overcoming performance bottlenecks for emerging workloads like AI,” the company said in a statement. The softcore implementation of the Power ISA, in particular, should give developers more control and even enable them to build their own instruction sets, Hugh Blemings, executive director of the OpenPower Foundation, explained. “They can now actually try crafting their own instruction sets, and try out new ways of the accelerated data processes and so forth at a lower level than previously possible,” he said. The company is announcing all of this today at The Linux Foundation Open Source Summit and OpenPower Summit in San Diego. Source
  24. Autonomous ‘Mayflower’ research ship will use IBM AI tech to cross the Atlantic in 2020 A fully autonomous ship called the ‘Mayflower’ will make its voyage across the Atlantic Ocean next September, to mark the 400-year anniversary of the trip of the first Mayflower, which was very much not autonomous. It’s a stark way to drive home just how much technology has advanced in the last four centuries, but also a key demonstration of autonomous seafaring technology, put together by marine research and exploration organization ProMare and powered by IBM technology. The autonomous Mayflower will be decked out with solar panels, as well as diesel and wind turbines, to provide it with propulsion power as it attempts the 3,220-mile journey from Plymouth in England to Plymouth in Massachusetts in the U.S. The trip, if successful, will be among the first in which a full-size seafaring vessel navigates the Atlantic on its own, which ProMare hopes will open the doors to other research-focused applications of autonomous seagoing ships. To support that use case, it’ll have three research pods on board while it makes its trip, developed by academics and researchers at the University of Plymouth, who will aim to run experiments in areas including maritime cybersecurity, sea mammal monitoring and even addressing the challenges of ocean-borne microplastics. IBM has provided technical support for both the research and the navigation aspects of the mission, including providing its PowerAI Vision technology backed by its Power Systems servers, which will work with deep learning models that it’s developing in partnership with ProMare to help with avoidance of obstacles and hazards at sea. It’ll use RADAR, as well as LIDAR and optical cameras, to accomplish all this. 
The system is designed to use both local and remote processing, meaning devices on the ship will be able to operate without connection at the edge, and then check back in periodically with HQ when conditions allow for processing via nodes located at either shore. This is a super cool project that could change the way we research the ocean, deep lakes and other aquatic environments. The plan is to also build VR and AR tools for essentially ‘boarding’ the autonomous Mayflower while it’s on its trip, so stay tuned for more on how you can get a closer look at the project as it prepares for its run next year. Source: Autonomous ‘Mayflower’ research ship will use IBM AI tech to cross the Atlantic in 2020
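The local-plus-remote design described above is a common edge-computing pattern: process each sensor reading on board without needing a connection, buffer the results, and flush them to a shore-side node whenever a link is available. The sketch below is my own minimal illustration of that pattern, not IBM's or ProMare's actual software; the class, threshold, and field names are invented:

```python
from collections import deque

class EdgeProcessor:
    """Process readings locally; sync buffered results when connected."""

    def __init__(self):
        self.buffer = deque()

    def process_reading(self, reading):
        # Local, connection-free processing (here: a trivial hazard flag
        # standing in for an onboard vision/obstacle model).
        result = {"reading": reading, "hazard": reading > 0.8}
        self.buffer.append(result)
        return result

    def sync(self, connected):
        """Flush buffered results to shore when a link is available."""
        if not connected:
            return []
        flushed = list(self.buffer)
        self.buffer.clear()
        return flushed

edge = EdgeProcessor()
edge.process_reading(0.3)
edge.process_reading(0.9)
print(edge.sync(connected=False))       # [] -- offline, results stay buffered
print(len(edge.sync(connected=True)))   # 2  -- both results flushed to shore
```

The key property is that nothing in `process_reading` depends on connectivity, so the ship keeps operating through the long offline stretches of an Atlantic crossing.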
  25. IBM has filed a patent infringement lawsuit against Zillow, claiming that the real estate company used key parts of IBM technology on its website, according to reports. IBM said the parts of Zillow’s site that estimate home values, as well as the way it does searches, are part of IBM’s patented technology. The computer giant has sued a number of other well-known companies in patent disputes as well. There are seven patents named in the lawsuit, including one about Zillow using computing power to figure out how desirable a geographic area is to customers, and another one about searches fitting inside of a screen. One of Zillow’s features, called Zestimate, gives estimates on how much a home is worth, and the search function is heavily reliant on maps. The lawsuit was filed in a California federal court, and IBM claims that it has been trying to get Zillow to agree to a patent licensing deal for the past three years. IBM said it turned to the courts because of Zillow’s refusal to play ball. “Because IBM’s over three-year struggle to negotiate a license agreement that remedies Zillow’s unlawful conduct has failed, IBM has been forced to seek relief through litigation,” the lawsuit said. “Among other relief sought, IBM seeks royalties on the billions of dollars in revenue that Zillow has received based on their infringement of IBM’s patented technology.” IBM wants to ban Zillow from using any of the seven patents in the future, and it wants damages from the alleged infringement of the patents. “We are aware of the lawsuit filed in federal court. We believe the claims in the case are without merit and we intend to vigorously defend ourselves against the lawsuit,” Zillow said. IBM said it spends a great deal of money on research and development and takes patent infringement very seriously. “IBM invests more than $5 billion annually in research and development, and relies on its patents to protect that investment,” IBM said in a statement. 
“We are acting to address Zillow’s unauthorized use and infringement of IBM intellectual property after years of trying to negotiate a fair agreement.” Source