Thursday, January 25, 2018
AT&T Migrates Massive Databases to Oracle Cloud in Historic Partnership
By Mike Faden
AT&T, like other large companies, is moving to the cloud to ensure it can continue to innovate and stay ahead in a fast-moving industry. The world’s largest communications company by revenue has led the telecom and pay-TV industries in virtualizing its global wide area network and putting it under software control, providing a foundation for introducing new services and responding to customer needs more quickly.
The company operates a private cloud based on proprietary virtualization to support its software-defined network services. And it has moved thousands of its smaller Oracle databases to another virtualized private cloud designed for more general-purpose use.
Until recently, AT&T lacked a cloud-based solution to run its roughly 2,000 largest mission-critical Oracle databases—those greater than 8 TB in size. Though AT&T’s general-purpose private cloud provides an agile, automated IT environment, it cannot provide the required performance for these very large, transaction-intensive databases, many of which contain customer data and must remain on premises for regulatory, privacy, and security reasons.
AT&T selected Oracle Cloud at Customer—which takes the same hardware and software platform that Oracle uses in its own cloud data centers and puts it into a “cloud machine” that lives in the customer’s data center—to run its largest mission-critical Oracle databases. Under an agreement described as “historic” by Oracle CEO Mark Hurd, AT&T and Oracle announced a strategic five-year partnership in May that includes moving AT&T’s large, high-performance databases to Oracle Cloud using Oracle Cloud at Customer. The agreement also includes global access to Oracle’s cloud portfolio, including Oracle Field Service Cloud Service, which AT&T will use to further optimize scheduling and dispatching for its more than 70,000 field technicians.
Oracle Cloud at Customer will enable AT&T to run mission-critical databases up to 100 TB in size in an Oracle-managed cloud that’s as flexible and scalable as a public cloud—but is located in AT&T facilities. These databases will be run on Oracle Database Exadata Cloud at Customer, which provides a highly scalable and reliable platform for running Oracle Database. “What is very intriguing about Oracle Cloud at Customer is that it offers all the benefits of a public cloud with the security and performance of a private cloud,” says Claude Garalde, AT&T lead principal technical architect. Application performance is also optimized because Oracle Cloud at Customer connects directly to AT&T’s data center network infrastructure, he adds. “For performance, you want the database to be really close to the application and middleware layers—you don’t necessarily want to be going out over a public internet link or even a VPN,” he says.
Moving the databases to Oracle Cloud will significantly increase business agility and automation. AT&T expects to halve the time required to implement big, complex databases, and it will be able to quickly increase capacity to meet demand peaks and reduce usage when demand recedes. “We want the solution to give us an elastic environment where we can scale up as the need arises and similarly scale down,” says Venkat Tekkalur, director of technology development at AT&T.
Dialing Up Databases
AT&T has more than 17,000 Oracle databases overall, storing a massive 19 PB of data. The company has been progressively migrating them to the cloud as part of a broad initiative that began more than five years ago; to date, it has moved about 5,000 of them to its general-purpose cloud. That cloud can support databases that are up to around 8 TB in size, Tekkalur says.
But until now, larger and more performance-intensive databases have still required a bare metal configuration. Although that approach delivered the required performance, AT&T faced challenges that are typical of those experienced by many large enterprises, Tekkalur says.
Factors such as the additional time required to order, deliver, and install hardware and software meant that it took roughly twice as long to implement a big database in a bare metal on-premises configuration compared to implementing databases in the cloud, he explains. The process was also more difficult to automate. As a result, it presented an obstacle to AT&T’s efforts to increase agility. “The mean time to implement was not aligned with the Agile methodology or the DevOps model,” Tekkalur says.
The approach also limited the ability to quickly scale to meet changes in business demand. “We often have to support major launches, such as new phones, with very little time to prepare,” Tekkalur adds.
In addition, the large databases and their supporting hardware were often dedicated to specific applications. That meant it was difficult to achieve savings by sharing infrastructure. “Once we brought in that hardware, there was no way to use it for anything else,” he says.
With Oracle Cloud at Customer, AT&T plans to solve those challenges, slashing implementation times for databases up to 100 TB while greatly increasing flexibility, with an elastic shared environment that facilitates scaling and allows resources to be easily reallocated based on demand. Oracle Database Exadata Cloud at Customer will provide the performance required for the large transaction-intensive databases. And because those databases will run at AT&T facilities behind the company’s firewall, they will also meet regulatory, privacy, and security needs.
Furthermore, AT&T is integrating Oracle Cloud at Customer so that to users, it looks and behaves just like part of AT&T’s overall cloud environment; from a single AT&T portal, users will be able to provision databases in Oracle Cloud at Customer, in AT&T’s general-purpose private cloud, or in public clouds. To achieve that integration, an abstraction layer below the portal will orchestrate a highly automated provisioning process across AT&T’s clouds using Oracle’s open cloud APIs to interface with Oracle Cloud.
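The abstraction-layer pattern described above can be sketched in a few lines: the portal calls a single provisioning interface, and per-cloud adapters translate each request into the target cloud’s own API. This is an illustrative sketch only—the class names, the routing rule, and the return values are hypothetical, not AT&T’s or Oracle’s actual APIs; the 8 TB threshold comes from the article.

```python
from abc import ABC, abstractmethod

class CloudProvisioner(ABC):
    """Common interface the portal calls, regardless of target cloud."""

    @abstractmethod
    def provision_database(self, name: str, size_tb: int) -> str:
        """Provision a database and return an identifier for it."""

class ExadataCloudAtCustomerProvisioner(CloudProvisioner):
    def provision_database(self, name: str, size_tb: int) -> str:
        # A real adapter would call Oracle's cloud REST APIs here.
        return f"exacc::{name}::{size_tb}TB"

class PrivateCloudProvisioner(CloudProvisioner):
    def provision_database(self, name: str, size_tb: int) -> str:
        # A real adapter would call the private cloud's own automation here.
        return f"private::{name}::{size_tb}TB"

def choose_backend(size_tb: int) -> CloudProvisioner:
    # Route very large databases to Oracle Cloud at Customer; smaller
    # ones go to the general-purpose private cloud (8 TB threshold).
    if size_tb > 8:
        return ExadataCloudAtCustomerProvisioner()
    return PrivateCloudProvisioner()

# Usage: the portal issues one call, and the abstraction layer
# decides which cloud actually fulfills it.
db_id = choose_backend(50).provision_database("billing", 50)
```

The value of the pattern is that new targets—public clouds, for example—can be added as adapters without changing the portal itself.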
To plan and implement the migration, AT&T is working closely with Oracle Consulting, which is providing a toolset to facilitate the migration process, including helping to size the required cloud database configuration and automate database provisioning. AT&T is also applying lessons gleaned from its private cloud experience to accelerate and automate the process, says Andy Ferretti, lead system engineer at AT&T. The net result is that AT&T expects to cut the time it currently takes to complete the entire procurement and deployment process for big, complex databases by 50 percent or more. The time required to implement these large databases in Oracle Cloud at Customer will be similar to the time required to implement much smaller databases in AT&T’s private cloud today.
AT&T is also exploiting techniques learned in previous migrations to minimize downtime of these mission-critical databases as they move to Oracle Cloud at Customer, Ferretti says. After building a target database instance in Oracle Cloud, AT&T will take a snapshot of the source on-premises data and begin moving it to its new home in the cloud. During the time it takes to move the multiterabyte databases, AT&T will continue to capture the changes to the live on-premises database. Once the snapshot has been copied to the cloud, a synchronization method such as Oracle Active Data Guard or Oracle GoldenGate will be used to bring the target database up to date with the latest changes, so AT&T can quickly cut over to Oracle Cloud at Customer to support the live application. A reverse synchronization method will be put in place just in case there’s a need to revert to the original database.
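The phased cutover described above can be summarized as an ordered sequence of steps. The sketch below is illustrative only: every function is a placeholder that records the phase in a log, not a real Oracle Active Data Guard or GoldenGate command, and the database names are invented.

```python
log = []  # records the order of the migration phases

def build_target_instance(target):
    log.append(f"build {target}")          # create the Cloud at Customer database

def take_snapshot(source):
    log.append(f"snapshot {source}")       # point-in-time copy of on-premises data
    return f"{source}-snap"

def start_change_capture(source):
    log.append(f"capture {source}")        # track changes made during the copy

def copy_to_cloud(snapshot, target):
    log.append(f"copy {snapshot} -> {target}")

def apply_captured_changes(source, target):
    # In practice, a tool such as Active Data Guard or GoldenGate
    # replays the captured changes to bring the target up to date.
    log.append(f"sync {source} -> {target}")

def start_reverse_sync(target, source):
    log.append(f"reverse-sync {target} -> {source}")  # fallback path

def cut_over(target):
    log.append(f"cutover {target}")        # point the live application at the cloud

def migrate_with_minimal_downtime(source, target):
    build_target_instance(target)
    snap = take_snapshot(source)
    start_change_capture(source)
    copy_to_cloud(snap, target)
    apply_captured_changes(source, target)
    start_reverse_sync(target, source)
    cut_over(target)

migrate_with_minimal_downtime("onprem-db", "exacc-db")
```

The key design point is that the application keeps writing to the source database throughout the copy; only the final synchronization and cutover require a brief switchover, and the reverse-sync path preserves a way back to the original database.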
After testing by early adopters in late 2017, the first Oracle Cloud at Customer databases are set to go live in early 2018, Ferretti says. Following that, the databases will progressively move to Oracle Cloud in phases. Ultimately, the plan is to implement Oracle Cloud at Customer at roughly 19 AT&T locations.
Low on Risk, High on ROI
Like AT&T, many other large enterprises are viewing Oracle Cloud at Customer as a way to solve data-protection as well as performance concerns as they migrate to the cloud, says Andrew Mendelsohn, executive vice president for database server technologies at Oracle. “For companies that have regulatory concerns or privacy concerns about customer data, this is a very low-risk way to go. Customers get all the agility and business model of the cloud, but they run in their own data center.”
The fact that the databases are in the customer’s data center and on the same network as the company’s business applications “eliminates the performance latency that you would have between an on-premises application and a database in the public cloud,” Mendelsohn adds. “And it is a stepping stone to a public cloud. If and when companies feel comfortable using the public cloud, it will be easy for them to move these databases.”
Garalde says that over the long term, AT&T is also looking to further increase agility by replacing big, monolithic applications with multiple microservices, each potentially with its own database, linked together via open APIs. This approach would allow AT&T to create and update new services more quickly by plugging together different combinations of microservices.
For AT&T, the cloud partnership with Oracle is a crucial step in its drive to deliver a seamless and intuitive experience for customers and to maintain industry leadership. “We believe that the future of the network is to be data-powered, to be software-centric, and to be fast and responsive,” says John Donovan, CEO of AT&T Communications. “This collaboration with Oracle accelerates our network transformation and migration to the cloud to expand efficiency, [increase] performance, and reduce cost while improving overall customer service.”
Mike Faden is a principal at Content Marketing Partners and is based in Portland, Oregon.