We have been experiencing some general slowness issues, and the only metric I've ever had a bad feeling about is our logins per second. Most of the articles I find on the internet on the subject are for older versions. We are on SQL Server 2016.
During the day we are seeing average logins upwards of 1,000 to 3,000 per second, on a system with fewer than 100 active users.
Anybody know of a way to better monitor what is going on? DMVs, etc.? Or whether this is even a problem?
What are these logins you're talking about?
Is your application internal-use only, or is it exposed to the external web?
Do you have some sort of application using your database? Is it a web app or a client app? .NET or something else?
How long has your SQL Server been up and running? These values are cumulative; they are not logins at the point of running the query. And there are a whole lot more processes tapping your database than just your web application. What you will need to do is weed out superfluous logins and focus on the logins by the app-pool service account.
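To see why the cumulative counter matters, here's a sketch of a DMV query (assuming the standard counter and DMV names on SQL Server 2016) that pulls the raw Logins/sec counter alongside the instance start time, so you can compute the true average rate since startup rather than misreading the raw number:

```sql
-- Logins/sec is a cumulative counter, not a point-in-time rate.
-- Divide the total by uptime to get the real average rate.
SELECT
    pc.cntr_value                                            AS total_logins,
    si.sqlserver_start_time,
    DATEDIFF(SECOND, si.sqlserver_start_time, SYSDATETIME()) AS uptime_seconds,
    pc.cntr_value * 1.0
        / NULLIF(DATEDIFF(SECOND, si.sqlserver_start_time, SYSDATETIME()), 0)
                                                             AS avg_logins_per_second
FROM sys.dm_os_performance_counters AS pc
CROSS JOIN sys.dm_os_sys_info AS si
WHERE pc.counter_name = 'Logins/sec'
  AND pc.object_name LIKE '%General Statistics%';
```

For a true per-second rate, sample cntr_value twice a known interval apart and divide the delta by the interval.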
Anyway, are you doing full backups (.bak) and log backups? Have you identified which artifacts are causing issues?
Do you use LINQ and Entity Framework with your C#, and/or do you use stored procedures?
Really? Why?
Slowness issues with the application get worse over time, so the programmers wanted a weekly reboot. All indications are that the SQL Server is fine; I don't think it needs a reboot, but they want it.
Anyway, are you doing full backups (.bak) and log backups?
Yes, full backups
Have you identified which artifacts are causing issues?
No, I can't find any problem on the SQL Server side; I am looking for any clues.
Do you use LINQ and Entity Framework with your C#, and/or do you use stored procedures?
Not sure about LINQ and Entity Framework, but we do use stored procedures.
Are there any other types of applications accessing the data: report reads, or users running query tools and bypassing the application? OLAP vs. OLTP?
The trick is to know whether they are pooled or non-pooled connections. If they are pooled connections, that's fine. If they are non-pooled, then that can be an issue.
You can run Profiler for the Audit Login event and look at the event subclass; that will tell you pooled vs. non-pooled. Profiler will also tell you which applications are doing the logins.
A pooled login is a very lightweight way to reuse a SQL Server connection.
With that many connections I'd only run it for a minute or two outside of business hours.
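As a lighter-weight alternative to Profiler on SQL Server 2016, you could capture the same information with an Extended Events session. This is a sketch; the session name and file path are made up, and I believe the login event carries an is_cached column that flags pooled (reused) logins, but verify the event and column names against sys.dm_xe_object_columns on your instance:

```sql
-- Extended Events session to capture logins with app/host context.
-- is_cached on the login event should distinguish pooled reuse
-- from full, non-pooled logins.
CREATE EVENT SESSION [TrackLogins] ON SERVER
ADD EVENT sqlserver.login(
    ACTION (sqlserver.client_app_name, sqlserver.client_hostname))
ADD TARGET package0.event_file(SET filename = N'TrackLogins.xel',
                               max_file_size = (50));  -- MB
GO
ALTER EVENT SESSION [TrackLogins] ON SERVER STATE = START;
-- ...let it run for a minute or two, then:
-- ALTER EVENT SESSION [TrackLogins] ON SERVER STATE = STOP;
-- DROP EVENT SESSION [TrackLogins] ON SERVER;
```

Like Profiler, keep the capture window short at this login volume.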
That sounds like possible "connection leaks" and "memory leaks" from software that doesn't actually close out what it uses. Also, you need to have something that updates statistics properly.
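On the statistics point, here's a rough check (a sketch; thresholds are yours to pick) for stale statistics in the current database using sys.dm_db_stats_properties, which is available on SQL Server 2016:

```sql
-- Statistics with a high modification counter relative to row count,
-- or a very old last_updated, are candidates for an update.
SELECT
    OBJECT_NAME(s.object_id) AS table_name,
    s.name                   AS stats_name,
    sp.last_updated,
    sp.rows,
    sp.modification_counter
FROM sys.stats AS s
CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
WHERE OBJECTPROPERTY(s.object_id, 'IsUserTable') = 1
  AND sp.modification_counter > 0
ORDER BY sp.modification_counter DESC;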
Yowch... 30,000 logins in 6 or 7 minutes. That's pretty hefty. The Logins/sec counter might be an indication of something weird like establishing a connection, getting one piece of information, closing the connection, wash-rinse-repeat, for every object on multiple screens. You'd have to look at the actual calls on the database to see what's happening there.
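To see the actual callers, a sketch of a DMV query that groups the current sessions by application, host, and login, which should quickly point at whoever is responsible for the churn:

```sql
-- Which applications/hosts/logins hold connections right now.
SELECT
    s.program_name,
    s.host_name,
    s.login_name,
    COUNT(*)          AS session_count,
    MIN(s.login_time) AS oldest_login,
    MAX(s.login_time) AS newest_login
FROM sys.dm_exec_sessions AS s
WHERE s.is_user_process = 1
GROUP BY s.program_name, s.host_name, s.login_name
ORDER BY session_count DESC;
```

A "connect, fetch one thing, disconnect" pattern shows up here as a large session_count with very recent login_time values for one program_name.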
As for "slow" queries... the slowest queries are actually usually pretty fast. The real problem is when there are thousands of nearly identical queries that each take far less than 600 milliseconds, and the real killer is when they have to compile for each slight variation. We had such a thing going on: < 200 ms to execute but 2 to 22 SECONDS to compile for each and every run. Of course, that also made the plan cache go absolutely nuts. We moved the query from being EF-generated to a stored procedure and optimized it. We got the execution time down to < 10 ms, and it was totally reusable, so there was virtually no compile time except on the first run, and even that was certainly < 2 seconds.
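You can check whether the plan cache is going nuts the same way on your own system. A sketch of a query over sys.dm_exec_cached_plans that shows how much cache is eaten by single-use plans, the signature of non-parameterized, near-identical queries each compiling their own plan:

```sql
-- Plan cache usage by object type; a large single_use_plans count
-- under 'Adhoc' suggests compile churn from non-parameterized queries.
SELECT
    cp.objtype,
    COUNT(*)                                          AS plan_count,
    SUM(CASE WHEN cp.usecounts = 1 THEN 1 ELSE 0 END) AS single_use_plans,
    SUM(cp.size_in_bytes) / 1024 / 1024               AS cache_mb
FROM sys.dm_exec_cached_plans AS cp
GROUP BY cp.objtype
ORDER BY cache_mb DESC;
```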