
Tuesday, May 12, 2015

T-SQL Tuesday #66: Monitoring

This month's T-SQL Tuesday is being hosted by Cathrine Wilhelmsen (b|t), and the topic is as follows: Monitoring.

I thought I'd finally give it a go and join the T-SQL Tuesday tradition. For the last few months I kept telling myself: this topic isn't really suited for me, or I don't have the time right now. But as most people say about blogging: if you never start, you never blog at all, so here it goes!

Since I'm a SQL/BI developer, I'm not overly involved in monitoring servers or SQL instances, but after thinking about it, I do have a few things I monitor during my day-to-day tasks.

SSIS
I tend to execute quite a few packages and jobs during the day. I monitor them via the standard reports of the SSISDB catalog in SSMS, as that's the easiest way. Because I follow naming conventions for my packages, tasks, and components, I can quickly pinpoint issues and see where things go wrong.
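If I just want a quick list of failed executions without opening the reports, something along these lines against the SSISDB catalog views also works (a minimal sketch using the catalog.executions view; adjust the time window and status filter to your situation):

-- Recent SSISDB executions that did not succeed
-- (status: 1 created, 2 running, 3 canceled, 4 failed, 6 ended unexpectedly, 7 succeeded)
SELECT TOP 50 e.execution_id
 ,e.folder_name
 ,e.project_name
 ,e.package_name
 ,e.status
 ,e.start_time
 ,e.end_time
FROM SSISDB.catalog.executions e
WHERE e.status IN (4, 6) -- failed or ended unexpectedly
 AND e.start_time > DATEADD(DAY, -1, SYSDATETIMEOFFSET()) -- last 24 hours
ORDER BY e.start_time DESC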

SQL
I also regularly take a look at the servers running my solutions to keep everything going smoothly. Until now this has mostly been reactive, e.g. when users start complaining about slow reports or dashboards.
I use the query below to find the top slow queries that have run recently.

SELECT TOP 20 SUBSTRING(qt.TEXT, (qs.statement_start_offset / 2) + 1,
   ((CASE qs.statement_end_offset
      WHEN -1 THEN DATALENGTH(qt.TEXT)
      ELSE qs.statement_end_offset
     END - qs.statement_start_offset) / 2) + 1) AS SQLStatement
 ,qs.execution_count
 ,qs.total_logical_reads
 ,qs.last_logical_reads
 ,qs.total_logical_writes
 ,qs.last_logical_writes
 ,qs.total_worker_time
 ,qs.last_worker_time
 ,qs.total_elapsed_time / 1000000 AS total_elapsed_time_in_Sec
 ,qs.last_elapsed_time / 1000000 AS last_elapsed_time_in_Sec
 ,qs.last_execution_time
 ,qp.query_plan
FROM sys.dm_exec_query_stats qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) qt
CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) qp
ORDER BY qs.total_logical_reads DESC -- logical reads
-- ORDER BY qs.total_logical_writes DESC -- logical writes
-- ORDER BY qs.total_worker_time DESC -- CPU time

Besides finding slow queries that have already run, it sometimes happens that a DWH job is still running in the morning when it should have completed looong ago :). Running sp_who2, or sp_WhoIsActive by Adam Machanic (b|t), usually helps me find out why it is still running and what the problem is.
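A typical call looks something like this (a sketch; @get_plans, @find_block_leaders and @sort_order are standard sp_WhoIsActive parameters, but check the version you have installed):

-- Show active requests with their plans, putting head blockers at the top
EXEC dbo.sp_WhoIsActive @get_plans = 1 -- include the plan of each running statement
 ,@find_block_leaders = 1 -- count how many sessions each session is blocking
 ,@sort_order = '[blocked_session_count] DESC' -- head blockers first

From there it is usually clear whether the job is blocked by another session or simply chewing through more data than expected.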

Hopefully I can continue to add more posts to the T-SQL Tuesday tradition. Have a nice day and keep sharing!

Friday, July 8, 2011

Monitoring BizTalk with BizTalk360

When I attended the last BizTalk User Group meeting in Almere on the 29th of June, it was the first time I heard of this product: BizTalk360. It describes itself as the "World's first web based BizTalk production support / monitoring tool". It immediately got my attention.
At the BTUG, Saravana Kumar was representing Kovai Ltd., the company behind BizTalk360. The funny part: just a few hours ago I saw that it was his Preparation Diary for BizTalk 2006 R2 that helped me get that MCTS! Thanks for that :-)

Now on to the product itself: it's like the BizTalk Admin Console, but much better.
BizTalk360 Environment dashboard
- Environment dashboard: an overview of all the properties; you could compare this to the Group Hub page.
- Fine-grained authorization: users can be given different access rights to the dashboard (read-only, specific applications, specific parts of the dashboard).
- Topology diagram: this automatically generates the physical network topology of the BizTalk group. It recognizes BizTalk and SQL servers and whether a machine is a web, database, or application server.
- Governance / auditing: all key activities performed by users are audited. You never have to wonder who stopped that Send Port or who restarted that Host Instance.


Besides that, there is also the Advanced Event Viewer, the BizTalk Server dashboard, the fact that it is web-based, and lots of other things. I won't go into all the features here; you should just check it out yourself. Or, if you don't have a development server set up, you can also try their demo, which is hosted on Windows Azure using the AppFabric Service Bus.

As we are right in the middle of taking on the management of more BizTalk applications, BizTalk360 comes at exactly the right time and place. I am definitely going to have a look at the dev edition and see what I can get out of it. I think our administrators would love to use it too.
