Tuesday, 1 April 2014

SQL Server 2014, First Take: Powerful and flexible, with added in-memory support

Source: ZDNET.com


Summary: A new version of Microsoft's database has been released, with enhanced in-memory support and Azure-hosted backup.

Microsoft has been running SQL Server 2014 through its CTP programme for a while, and it's now time for the company's latest database release to reach general availability. We recently spent some time at Microsoft's Redmond campus trying out its key new features.

SQL Server 2014 is, at heart, very much the familiar SQL Server. It uses the same management tools, the same T-SQL language, and the same APIs that connect it to your applications. That means you should be able to upgrade existing databases in place to take advantage of its performance and scaling improvements. But that's only part of the story, as Microsoft has been looking at the ways we use data in modern applications, and has added new features that should dramatically improve performance — and that also bring on-premises databases and the cloud closer together.
SQL Server 2014's Management Studio looks much like its predecessors, so you can get started managing databases as soon as you've installed the database and tools. Image: Microsoft

In-memory support: Hekaton

The biggest change is the launch of Hekaton, SQL Server's new in-memory OLTP and data warehousing engine. It's an important new feature because it adds in-memory support out of the box, focused purely on performance. You don't need to put your whole database in-memory, either — just the tables that will get a performance boost. Smaller databases built using SQL Server 2014 Standard won't get access to these new features: they're only part of the Enterprise edition.
You can get quite a performance boost from in-memory tables in SQL Server 2014 — this sample app gets a 14x speed-up. Image: Microsoft
In-memory OLTP makes a lot of sense. It's not a problem that can be solved by throwing parallel cores at a database — and with single-core CPU performance largely flat, you can take advantage of falling memory prices by shifting to in-memory operations. The Hekaton engine has been designed from scratch to work with modern memory, and is a lot more than just a cache, building on techniques developed for Azure. There's a new query engine for in-memory operations that doesn't use locks; instead it uses independent threads with low-level interlocks to ensure data integrity. Where a traditional lock can take thousands of CPU cycles, a Hekaton interlock takes just 10 or 20.

SQL Server 2014 provides an analysis, migration and reporting (AMR) tool. In SQL Server Management Studio, right-click on a table and choose the Memory Optimisation Advisor to check the table and validate whether it can be converted to an in-memory table. The process builds the appropriate filegroup and copies the data into an in-memory table — and you should end up with a 20x to 40x speed-up.
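
Under the covers, the conversion the Advisor performs amounts to T-SQL along these lines: add a memory-optimised filegroup, give it a container on disk, then recreate the hot table as memory-optimised. The sketch below shows the general shape; the database, path, table and column names are hypothetical, and bucket counts would need tuning for your own data.

ALTER DATABASE SalesDB
    ADD FILEGROUP SalesDB_MemOpt CONTAINS MEMORY_OPTIMIZED_DATA;
ALTER DATABASE SalesDB
    ADD FILE (NAME = N'SalesDB_MemOpt_Container', FILENAME = N'C:\Data\SalesDB_MemOpt')
    TO FILEGROUP SalesDB_MemOpt;

-- Recreate the hot table as memory-optimised; SCHEMA_AND_DATA keeps it fully durable
CREATE TABLE dbo.ShoppingCart (
    CartId INT NOT NULL PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1000000),
    UserId INT NOT NULL,
    CreatedDate DATETIME2 NOT NULL
) WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);
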
You can use built-in diagnostic tools to determine what elements of a database will benefit from a switch to in-memory, including stored procedures. Image: Microsoft
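
Stored procedures on that hot path can go a step further: SQL Server 2014 lets you rebuild them as natively compiled procedures that run as machine code directly against memory-optimised tables. A minimal sketch, reusing the hypothetical dbo.ShoppingCart table from the example above:

CREATE PROCEDURE dbo.usp_AddCartItem
    @CartId INT, @UserId INT, @CreatedDate DATETIME2
WITH NATIVE_COMPILATION, SCHEMABINDING, EXECUTE AS OWNER
AS
BEGIN ATOMIC WITH (TRANSACTION ISOLATION LEVEL = SNAPSHOT, LANGUAGE = N'us_english')
    -- The insert is compiled to native code rather than interpreted
    INSERT INTO dbo.ShoppingCart (CartId, UserId, CreatedDate)
    VALUES (@CartId, @UserId, @CreatedDate);
END;
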
Convert large tables to column stores to speed up access — and to make them easier to manage. Image: Microsoft
Massive databases, with millions of rows, can take advantage of the new Clustered Column Store Index, which improves compression, reduces I/O and fits more data in memory. Staged data is moved into a column store, from where it can be accessed more quickly.
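
As an illustration, converting a large table into a clustered columnstore is a single statement; the fact table name here is hypothetical, and in SQL Server 2014 the table can't keep any other indexes alongside the columnstore.

-- Store dbo.FactSales as compressed column segments instead of rows
CREATE CLUSTERED COLUMNSTORE INDEX CCI_FactSales ON dbo.FactSales;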

The Azure connection


Microsoft's Azure cloud platform mixes its own SQL Azure database service with SQL Server running on virtual machines as part of its IaaS (Infrastructure-as-a-Service) offering. Although SQL Server 2014 is still at heart an application, rather than a service, it's been designed to take advantage of the cloud, using Azure's storage and IaaS capabilities to give businesses of all sizes access to cloud-hosted disaster recovery.

Large databases can mean expensive, and often slow, backups. Using Azure as a subscription-based backup target means there's no capital expenditure, and you can keep your existing backup techniques — just with Azure as the destination. It's arguably more secure than a traditional backup: Azure holds three copies of your data, so it's always available. Getting started can take time, so Azure offers the option of making your initial backup on a local disk, which is then shipped to Microsoft and loaded into Azure, ready for the rest of your backups to go over the wire. Backups can be encrypted, and there's even support for older versions of SQL Server.
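
As a sketch of what 'Azure as a target' looks like in practice, backup to URL uses a SQL Server credential that holds the storage account name and access key; the account, container, database and credential names below are hypothetical.

CREATE CREDENTIAL AzureBackupCred
    WITH IDENTITY = 'mystorageaccount',        -- Azure storage account name
    SECRET = '<storage account access key>';

BACKUP DATABASE SalesDB
    TO URL = 'https://mystorageaccount.blob.core.windows.net/backups/SalesDB.bak'
    WITH CREDENTIAL = 'AzureBackupCred', COMPRESSION, STATS = 10;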

Managed backup tools automate the process. All you need to do is define the Azure account you're using and a retention period. SQL Server will then take a log backup for every 5MB of log growth or at least daily, and a fresh full backup once the log has grown by 1GB. If you accidentally delete a log backup, the system will detect that you no longer have a consistent backup chain, and will take a full backup.
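
A rough sketch of what enabling managed backup for a single database looks like, assuming SQL Server 2014's smart_admin procedures in msdb; the database and credential names are hypothetical, and the exact parameter list may differ in the shipping release.

USE msdb;
GO
EXEC smart_admin.sp_set_db_backup
    @database_name = N'SalesDB',
    @enable_backup = 1,
    @retention_days = 30,                    -- how long backups are kept in blob storage
    @credential_name = N'AzureBackupCred',
    @encryption_algorithm = 'NO_ENCRYPTION';
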
Azure and SQL Server can also be used as a disaster recovery (DR) solution, with an Azure IaaS SQL Server designated as an AlwaysOn replica. As soon as an on-premises server fails, you're switched to a cloud-hosted SQL Server instance preloaded with the last backup. It's not the cheapest approach, but it does mean you don't need to invest in running your own DR site. You can use any Azure region, and all you pay for is the IaaS VM and the storage you use. The backup tools validate the environment, and handle failures.
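
One way to wire this up is to add the Azure VM as an asynchronous secondary in an AlwaysOn availability group; a minimal sketch, with hypothetical group, server and endpoint names:

ALTER AVAILABILITY GROUP SalesAG
    ADD REPLICA ON N'AZURE-SQLVM1'
    WITH (
        ENDPOINT_URL = N'TCP://azure-sqlvm1.cloudapp.net:5022',  -- mirroring endpoint on the Azure VM
        AVAILABILITY_MODE = ASYNCHRONOUS_COMMIT,                 -- asynchronous commit suits a WAN link
        FAILOVER_MODE = MANUAL);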

One cheaper option is to use SQL Server's Azure cloud backup as the basis of a cold-start DR service. By hosting a suspended SQL Server instance on Azure IaaS (which only costs you while the VM is running), you can use your cloud backup data to update the databases associated with your cloud DR server, bringing you back online after a failure. It's not as fast as a failover onto an always-running DR server, but it's an economical approach that will work well for smaller businesses.
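
Bringing that cold-start server up to date is then a restore straight from the blob that holds your cloud backups; a sketch using the same hypothetical names as above:

RESTORE DATABASE SalesDB
    FROM URL = 'https://mystorageaccount.blob.core.windows.net/backups/SalesDB.bak'
    WITH CREDENTIAL = 'AzureBackupCred',
         REPLACE,      -- overwrite the stale copy on the DR server
         RECOVERY;     -- bring the database online straight away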

With hybrid cloud scenarios in mind, there's also tooling that will migrate a SQL Server database from an on-premises server to a virtual machine running on Azure. It's not just for SQL Server 2014, either, as the wizard will migrate SQL Server 2008, 2008 R2 and 2012, with support for VMs running SQL Server 2012 and 2014. It's an approach that makes it easier to handle database migrations, or to use Azure as a development platform for new applications — or, of course, to move from on-premises to cloud.

Deploying SQL Server 2014 in Azure is simplified by Microsoft providing VM images with SQL Server already installed. All you need to do is pick the image you want, deploy it, and you're ready to go. Once it's instantiated you can open SQL Server 2014's Management Studio, and use the Deploy Database to Windows Azure VM option to launch the wizard. Connect to the remote server, sign in to Azure, publish to a database in your VM, and (once the data has uploaded) away you go.

Conclusion

Microsoft's latest SQL Server is a product of a new way of working in the company's server business. Rather than a big-bang release, it's more of an incremental improvement with support for new ways of working and for new ways of securing your data. You don't need to take advantage of all its new features straight away; you can add them to your applications and management processes as and when you want to use them. The result is a powerful and flexible database that can get the most from modern hardware, and at the same time give you a route to delivering on the hybrid cloud promise, working with local and cloud data using familiar tools.
Simon Bisson

About Simon Bisson

Simon Bisson is a freelance technology journalist. He specialises in architecture and enterprise IT. He ran one of the UK's first national ISPs and moved to writing around the time of the collapse of the first dotcom boom. He still writes code.
