With the new In-Memory OLTP engine in SQL Server 2014, you can load tables and stored procedures into memory, which provides very fast response times. The goal isn't to load all of a database's tables and stored procedures into memory, but only the tables that are critical to performance and the stored procedures that perform complex logical calculations.
To help you identify which tables and stored procedures will give you the best performance gain after migrating to In-Memory OLTP, SQL Server 2014 provides the new Analysis, Migrate, and Report (AMR) tool. The AMR tool, built into SQL Server Management Studio (SSMS), consists of:
- Transaction performance collector (which collects data about existing tables and stored procedures in order to analyze workloads) and transaction performance analysis reports (which give recommendations about the tables and stored procedures to migrate to In-Memory OLTP based on the collected data)
- Memory Optimization Advisor (which guides you through the process of migrating a table to a memory-optimized table)
- Native Compilation Advisor (which helps you identify T-SQL elements that need to be changed before migrating a stored procedure to a natively compiled stored procedure)
- Stored Procedure Usage Analysis collection set (which captures information about stored procedures for a future migration to natively compiled stored procedures)
- Table Usage Analysis collection set (which captures information about disk-based tables for a future migration to memory-optimized tables)
Before you can use the AMR tool, you need to configure the Management Data Warehouse and the data collection process. After showing you how to do so, I'll demonstrate how to run the transaction performance analysis reports and how to use the two advisors.
Configuring the Management Data Warehouse
To configure the management data warehouse, go to Object Explorer in SQL Server Management Studio. Expand the Management folder, right-click Data Collection, point to Tasks, and then click Configure Management Data Warehouse. The Configure Management Data Warehouse Wizard starts. After the welcome page, you'll see the Select configuration task page. On this page, select the option to set up a management data warehouse.
On the Configure Management Data Warehouse Storage page, you must specify the name of the database that will host the management data warehouse and the name of the server on which that database resides. If you need to create the database, click the New button.
On the Map Logins and Users page, you can grant logins and users access to the server that hosts the management data warehouse. If necessary, you can change users' membership in the administrator, writer, and reader roles for the management data warehouse.
On the Complete the Wizard page, verify the management data warehouse configuration. If it's correct, click Finish.
When the configuration of the Management Data Warehouse has successfully completed, you should see a page like that in Figure 1.
Configuring the Data Collection Process
To configure the data collection process, go to Object Explorer in SQL Server Management Studio. Expand the Management folder, right-click Data Collection, point to Tasks, and then click Configure Data Collection. The Configure Data Collection Wizard starts. After the welcome page, you'll see the Setup Data Collection Sets page shown in Figure 2. Besides specifying the server and database that host the management data warehouse, you must specify the data collection sets. In the list of collection sets, select the Transaction Performance Collection Sets check box so the collector will gather statistics about transaction performance issues.

If the management data warehouse is on a different SQL Server instance than the data collector, and if SQL Server Agent isn't running under a domain account that has dc_admin permissions on the remote instance, you must use a SQL Server Agent proxy. If that's the case, be sure to select the Use a SQL Server Agent proxy for remote uploads check box.
Once you've finished configuring the Setup Data Collection Sets page, click Finish. When the wizard completes the configuration, you'll have a data collection process that's enabled to collect information about all user databases. Note that SQL Server Agent must be started on the instance that will collect the data.
In the SQL Server Agent's Jobs folder, you'll see the jobs used to collect the workload data and the jobs used to upload that data into the management data warehouse. The data collection jobs use the naming convention collection_set_N_collection, where N is a number. The upload jobs use the naming convention collection_set_N_upload, where N is a number.
By default, the AMR tool collects data from three dynamic management views (DMVs) every 15 minutes for both the Stored Procedure Usage Analysis and the Table Usage Analysis collection sets. The upload job runs every 30 minutes for the Stored Procedure Usage Analysis collection set and every 15 minutes for the Table Usage Analysis collection set. If you want faster uploads, you can run these jobs manually. Uploading the data has minimal performance impact.
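For example, assuming the Table Usage Analysis upload job was created as collection_set_5_upload on your instance (the N in the job name varies by installation, so check the SQL Server Agent Jobs folder first), you could trigger an upload manually with msdb's sp_start_job:

```sql
-- Start a collection-set upload job on demand. The job name below is an
-- assumption for illustration; substitute the actual
-- collection_set_N_upload name found on your instance.
USE msdb;
GO
EXEC dbo.sp_start_job @job_name = N'collection_set_5_upload';
```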
Running the Transaction Performance Analysis Reports
To access the recommendations based on the information collected about the workload on all user databases, you need to run the transaction performance analysis reports. To access them, right-click the database hosting the management data warehouse, select Reports, select Management Data Warehouse, and then click Transaction Performance Analysis Overview. From the Transaction Performance Analysis Overview page, you can choose to run three reports, depending on the type of information you need:
- Recommended Tables Based on Usage
- Recommended Tables Based on Contention
- Recommended Stored Procedures Based on Usage
Recommended Tables Based on Usage. This report tells you which tables are the best candidates for migration to In-Memory OLTP based on their usage. Figure 3 shows a sample report. On the left side, you can select the database and how many tables you'd like to see from that database. The chart will then show the selected tables. The horizontal axis represents the amount of work needed to migrate a table to In-Memory OLTP. The vertical axis represents the gains you'll achieve after migrating the table. The best candidates for In-Memory OLTP are located in the top right corner. As you can see, they can be easily migrated and will give you the best performance gain.
You can access a detailed report for a table by clicking its name in the chart. As Figure 4 shows, this report provides the table's access statistics (e.g., lookups, range scan) and contention statistics (e.g., latches, locks), as well as when this information was captured.
Recommended Tables Based on Contention. This report tells you which tables are the best candidates for migration to In-Memory OLTP based on their contention. If you compare the contention analysis report in Figure 5 with the usage analysis report in Figure 3, you'll see that they're very similar.
Recommended Stored Procedures Based on Usage. This report shows you which stored procedures are the top candidates for an In-Memory OLTP migration based on their usage (i.e., total CPU time). After selecting the database and how many stored procedures you'd like to see from that database, the resulting chart shows the top candidates for migration, as Figure 6 shows.
If you want to see the detailed usage statistics for a specific stored procedure, you can click its blue bar. Figure 7 shows an example of the report you'll receive.
Using the Memory Optimization Advisor
After you know which tables you want to migrate to In-Memory OLTP, you can use the AMR tool's Memory Optimization Advisor to help with the migration process. To access this advisor, open Object Explorer in SQL Server Management Studio and navigate to the table you want to migrate. Right-click the table and choose Memory Optimization Advisor.
The advisor launches with an introduction page, which you can read or skip. Clicking Next takes you to the Migration Optimization Checklist page, where the advisor checks whether your table can be migrated. If one or more validation items fail, the migration process stops. If necessary, you can generate a report of this analysis. If you see only green check marks, the table has no features that could prevent the migration, and you can proceed to the next page.
On the Migration Optimization Warnings page, you'll find important information about what isn't supported in memory-optimized tables, as well as other potential problems. The issues listed in this table won't necessarily prevent the migration, but they can cause other objects to fail or behave unexpectedly.

If a warning applies to the table you selected for migration, an exclamation point in a yellow triangle will appear next to the warning, as shown in Figure 8. In this case, the selected table has an unsupported French_CI_AS collation on the indexed column named Person_OnDisk_Name. (Only BIN2 collations are supported for indexes in memory-optimized tables.) Thus, the index collation will need to be changed later in the migration process.

On the Review Optimization Options page, which Figure 9 shows, you have the option to change the defaults listed for the:
- Name of the memory-optimized filegroup (only one memory-optimized filegroup is allowed per database)
- Logical filename
- Path where the logical file will be saved
- New name given to the original table (the original table is renamed to prevent naming conflicts)
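Behind the scenes, these options correspond to standard DDL. As a rough sketch (the database, filegroup, logical file, and path names here are made up for illustration), the advisor's filegroup settings map to statements like:

```sql
-- Create the memory-optimized filegroup (one per database) and add a
-- container to it. All names and the file path are illustrative only.
ALTER DATABASE PersonDb
    ADD FILEGROUP PersonDb_mod CONTAINS MEMORY_OPTIMIZED_DATA;

ALTER DATABASE PersonDb
    ADD FILE (NAME = N'PersonDb_mod_file',
              FILENAME = N'C:\Data\PersonDb_mod')
    TO FILEGROUP PersonDb_mod;
```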
You can also choose to copy data from the original table to the new memory-optimized table during the migration process, and you can change the durability of the memory-optimized table. By default, its DURABILITY option will be set to schema_and_data, but you can change it to schema_only by selecting the Check this box to migrate this table to a memory-optimized table with no data durability option. If you do so, the data will be lost after the SQL Server service is restarted. In other words, just the table's schema is persistent.
Finally, the Review Optimization Options page shows the estimated current memory cost for the memory-optimized table. If there isn't sufficient memory, the migration process might fail.
Once you're done with the Review Optimization Options page, you can click Next to go to the Review Primary Key Conversion page. When the migration process begins, it will start by converting the primary key. You can convert it to:
- A nonclustered hash index, which gives the best performance for point lookups. If you select this option, you also need to specify the bucket count, which should be twice the expected number of rows.
- A nonclustered index, which gives the best performance for range predicates.
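To give an idea of what these choices produce, here's a hedged sketch of the kind of CREATE TABLE statement that results (the table, column, and index names are hypothetical, and the bucket count follows the twice-the-expected-rows guideline for an assumed one million rows):

```sql
-- Hypothetical memory-optimized table: the primary key is converted to a
-- nonclustered hash index with BUCKET_COUNT = 2 x expected rows, and the
-- char column is re-collated to a BIN2 collation as required for indexes.
CREATE TABLE dbo.Person_InMemory
(
    PersonId   INT NOT NULL
        PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 2000000),
    PersonName VARCHAR(100) COLLATE Latin1_General_100_BIN2 NOT NULL,
    INDEX IX_PersonName NONCLUSTERED (PersonName)
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);
```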
For each index in the table being migrated, you'll be presented with a Review Index Conversion page that has been populated with the columns and data types for that index. The options you can configure on the Review Index Conversion page are similar to those on the Review Primary Key Conversion page. In this case, for the indexed column Person_OnDisk_Name with the unsupported French_CI_AS collation, you'd have to select a BIN2 collation for the char data type.
On the Verify Migration Actions page, you'll see all operations that will be performed to migrate your table to In-Memory OLTP. You have the option to script those operations by clicking the Script button. After verifying all the options, you can click the Migrate button to start the migration process.
Figure 10 shows how the new memory-optimized table appears in SSMS. If you view its properties, you'll see that the Memory optimized property is set to True and that the schema and data are durable for this table. In Figure 10, you can also see how the original table has been renamed.
Using the Native Compilation Advisor
After learning which stored procedures you should migrate to In-Memory OLTP, you can use the AMR tool's Native Compilation Advisor to help with the migration. To access this advisor, open Object Explorer in SSMS and navigate to the stored procedure you want to migrate. Right-click the stored procedure and select Native Compilation Advisor.

After clicking through the Welcome page, you're presented with the Stored Procedure Validation page, which warns you if the stored procedure contains T-SQL elements that aren't compatible with native compilation. If the stored procedure is valid, it can become a natively compiled stored procedure without modification. If the stored procedure contains unsupported T-SQL elements, the validation will fail. To view details about the unsupported elements, click Next to go to the Stored Procedure Validation Result page, which Figure 11 shows. However, the Native Compilation Advisor doesn't migrate stored procedures the way the Memory Optimization Advisor migrates tables. You have to do the migration on your own.
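When you do migrate a procedure yourself, the target is a natively compiled stored procedure. The following is a minimal sketch (procedure, table, and column names are hypothetical) of the syntax such a procedure uses in SQL Server 2014:

```sql
-- Hypothetical natively compiled procedure. Native compilation requires
-- SCHEMABINDING, an EXECUTE AS clause, and an ATOMIC block that declares
-- the transaction isolation level and language.
CREATE PROCEDURE dbo.usp_GetPersonById
    @PersonId INT
WITH NATIVE_COMPILATION, SCHEMABINDING, EXECUTE AS OWNER
AS
BEGIN ATOMIC WITH (TRANSACTION ISOLATION LEVEL = SNAPSHOT,
                   LANGUAGE = N'us_english')
    SELECT PersonId, PersonName
    FROM dbo.Person_InMemory
    WHERE PersonId = @PersonId;
END;
```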