Thursday, December 28, 2017

Converting a Multidimensional SSAS Instance to a Tabular Instance

Have you ever had a situation where you wanted to convert a SQL Server Analysis Services instance from Multidimensional mode to Tabular mode?

I thought of trying this out after reading this article.  In this article from Microsoft, there is a Caution note, as shown below.


I thought I would try changing the DeploymentMode setting and see whether that changes my instance from Multidimensional mode to Tabular mode.  Of course, I tried this on my dev instance of SSAS 2016.

So first things first –
  1. Backed up my one and only worldwide SSAS database.
  2. Detached the database
  3. Checked the server mode in the SSAS properties as shown below.

  4. Disconnected from my SSAS instance.

  5. Below are the steps I followed to convert the Multidimensional SSAS instance to Tabular mode.

  6. Went to location -- C:\Program Files\Microsoft SQL Server\MSAS13.DEVELOPER\OLAP\Config
  7. Made a copy of the MSMDSRV.INI file
  8. Then opened Notepad as Administrator as shown below




  9. Then from Notepad opened the MSMDSRV.INI file
  10. Changed the DeploymentMode from 0 to 2 as shown below (see the snippet after this list)
  11. Restarted the Analysis Services service as shown below.

  12. Then connected to the SSAS instance in Management Studio and checked the properties.
  13. The Server Mode is showing as Tabular as shown below
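
For reference, the setting I changed sits in the MSMDSRV.INI file and looks roughly like the excerpt below; the elements around it in your copy of the file may differ, but the valid values are 0 (Multidimensional), 1 (SharePoint / Power Pivot) and 2 (Tabular).

 <ConfigurationSettings>
   ...
   <!-- 0 = Multidimensional, 1 = SharePoint (Power Pivot), 2 = Tabular -->
   <DeploymentMode>2</DeploymentMode>
   ...
 </ConfigurationSettings>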
   
Now I am planning to build a Tabular SSAS model on this instance and check whether it works.

I know I cannot reattach the worldwide multidimensional model database I saved earlier.

More in a future post on whether I have been successful or not.



Thursday, December 21, 2017

Why are my SQL jobs failing? -- High Memory/CPU usage?...

This week I had a situation where a job failed on Monday and again on Wednesday.  When I reran the job later in the day, it ran successfully.

When the job failed, my systems administrator got an alert for more than 90% memory usage on the server that runs the job.

He provided me with a graph that showed the peak usage, as shown below.





This prompted me to run a script that gives me a list of all the job runs for the last 3 days, along with how long each one took.

I used the table msdb.dbo.sysjobhistory.  In this table the run date and time are stored as two separate columns of type INT -- run_date and run_time.
Furthermore, the run_time column is stored in HHMMSS format but is not always 6 digits.

But the good news is that there is a function, msdb.dbo.agent_datetime, that we can use to combine these into a single run date/time.  It takes two arguments: run_date and run_time.
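
As a quick illustration with made-up values (rather than real job history), the function turns the two INT columns into a proper datetime:

 -- made-up example: run_date = 20171221, run_time = 83015 (i.e. 08:30:15)
 select msdb.dbo.agent_datetime(20171221, 83015) as 'RunDateTime'
 -- returns 2017-12-21 08:30:15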

There is also a run_duration column, stored as type INT, which holds the duration in the same HHMMSS format as run_time.  So, for example, 208 indicates that the job took 2 min 8 secs.
So I had to do some transformation to convert the duration to minutes, as shown in the script below.



-- Job run history for the last 3 days, with the duration converted to minutes
select
    j.name as 'JobName',
    h.step_name,
    msdb.dbo.agent_datetime(h.run_date, h.run_time) as 'RunDateTime',
    -- run_duration is an INT in HHMMSS format; convert it to seconds, then to minutes.
    -- The + 31 means 29 or more leftover seconds round up to the next minute instead of truncating.
    ((h.run_duration/10000*3600 + (h.run_duration/100)%100*60 + h.run_duration%100 + 31) / 60)
            as 'RunDurationMinutes'
from msdb.dbo.sysjobs j
inner join msdb.dbo.sysjobhistory h
    on j.job_id = h.job_id
where j.enabled = 1
  -- limit the history to the last 3 days
  and msdb.dbo.agent_datetime(h.run_date, h.run_time) >= dateadd(day, -3, getdate())
order by JobName, RunDateTime desc
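
As a quick sanity check of the conversion, plugging in the 208 example from above (2 min 8 secs) returns 2:

 -- 208  ->  (0*3600 + 2*60 + 8 + 31) / 60  =  159 / 60  =  2 minutes
 select ((208/10000*3600 + (208/100)%100*60 + 208%100 + 31) / 60) as 'RunDurationMinutes'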


I took the results and analysed them to see at what times the job runs peak.  The peak times for the three days are shown in the graph below.



I did some further analysis and decided to reschedule the overlapping jobs to run at different times, so that the server is not as heavily loaded at any one point (one way to do this kind of analysis in T-SQL is sketched below).
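
For example, a rough query like the one below counts the job runs starting in each hour over the same 3-day window, which makes the busy hours easy to spot:

 -- Sketch: number of job-step runs starting in each hour of the day, last 3 days
 select
    datepart(hour, msdb.dbo.agent_datetime(h.run_date, h.run_time)) as 'RunHour',
    count(*) as 'JobRuns'
 from msdb.dbo.sysjobs j
 inner join msdb.dbo.sysjobhistory h
    on j.job_id = h.job_id
 where j.enabled = 1
   and msdb.dbo.agent_datetime(h.run_date, h.run_time) >= dateadd(day, -3, getdate())
 group by datepart(hour, msdb.dbo.agent_datetime(h.run_date, h.run_time))
 order by count(*) desc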

I will check the memory usage tomorrow and see if there are any failures.

Are there any other ways you analyse CPU/memory usage?  I look forward to your comments.


Wednesday, December 06, 2017

INSERT EXEC failed because the stored procedure altered the schema of the target table.

This morning I was trying to call a stored procedure and insert the results into a temp table using the following code --


 create table #table1 (
    [col1] varchar(255) null,
    [col2] varchar(255) null,
    [col3] varchar(100) null,
    [col4] varchar(6) null,
    [col5] int null
 )

 Insert into #table1
 (
    [col1],
    [col2],
    [col3],
    [col4],
    [col5]
 )
 EXEC [dbo].[stored_proc1]
    @Region = N'Canada'

I got the error as below:
Msg 556, Level 16, State 3, Line 36
INSERT EXEC failed because the stored procedure altered the schema of the target table.

When I just run the stored procedure without inserting into the temp table, it runs fine and returns the results.

I checked the order of the columns in the temp table I am inserting into, as well as their data types and sizes.  Everything matched the result set of the stored procedure.

So I started investigating the code in the stored procedure, and I found one clue that could be causing this: a temp table inside the stored procedure has the same name (#table1) as the temp table I am inserting the results of the stored procedure into.
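
The situation boils down to something like the sketch below.  The procedure body here is only a hypothetical stand-in for the real [dbo].[stored_proc1]; the point is the collision between the caller's temp table and the one created inside the procedure.

 -- Hypothetical stand-in for the real procedure, shown only to illustrate the name collision
 create procedure [dbo].[stored_proc_demo]
 as
 begin
     -- the procedure builds and drops its own #table1 internally
     create table #table1 ([col1] varchar(255) null)
     insert into #table1 ([col1]) values ('some value')
     select [col1] from #table1
     drop table #table1
 end
 go

 -- the caller also uses a temp table called #table1 as the INSERT EXEC target,
 -- which is the pattern that raised Msg 556 above
 create table #table1 ([col1] varchar(255) null)

 Insert into #table1 ([col1])
 EXEC [dbo].[stored_proc_demo]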
So I changed my temp table name to #table2, and the results of the stored procedure were then inserted into the new temp table #table2 successfully.

So the final code is as follows:



 create table #table2 (
    [col1] varchar(255) null,
    [col2] varchar(255) null,
    [col3] varchar(100) null,
    [col4] varchar(6) null,
    [col5] int null
 )

 Insert into #table2
 (
    [col1],
    [col2],
    [col3],
    [col4],
    [col5]
 )
 EXEC [dbo].[stored_proc1]
    @Region = N'Canada'
