<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Script Library Archives - The SERO Group</title>
	<atom:link href="https://theserogroup.com/tag/script-library/feed/" rel="self" type="application/rss+xml" />
	<link>https://theserogroup.com/tag/script-library/</link>
	<description>SQL Servers Healthy, Secure, And Reliable</description>
	<lastBuildDate>Tue, 20 Jan 2026 19:40:14 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9</generator>

<image>
	<url>https://theserogroup.com/wp-content/uploads/2024/07/cropped-Canister-only-1-32x32.png</url>
	<title>Script Library Archives - The SERO Group</title>
	<link>https://theserogroup.com/tag/script-library/</link>
	<width>32</width>
	<height>32</height>
</image> 
<site xmlns="com-wordpress:feed-additions:1">121220030</site>	<item>
		<title>How to Find Queries Causing RESOURCE_SEMAPHORE Waits in SQL Server</title>
		<link>https://theserogroup.com/sql-server/how-to-find-queries-causing-resource_semaphore-waits-in-sql-server/</link>
		
		<dc:creator><![CDATA[Lee Markum]]></dc:creator>
		<pubDate>Wed, 21 Jan 2026 13:00:54 +0000</pubDate>
				<category><![CDATA[SQL Server]]></category>
		<category><![CDATA[Database]]></category>
		<category><![CDATA[Database Administration]]></category>
		<category><![CDATA[Database Development]]></category>
		<category><![CDATA[IT Manager]]></category>
		<category><![CDATA[Query Store]]></category>
		<category><![CDATA[Script Library]]></category>
		<category><![CDATA[SQL]]></category>
		<category><![CDATA[SQL Server Management]]></category>
		<guid isPermaLink="false">https://theserogroup.com/?p=7713</guid>

					<description><![CDATA[<p>The resource_semaphore wait can have devastating consequences for SQL Server performance. This wait essentially means that some of the queries in your workload require memory grants larger than the server&#8217;s memory can support. When that happens, SQL Server feels frozen and unresponsive. Queries are likely running, but&#8230; <br /> <a class="read-more" href="https://theserogroup.com/sql-server/how-to-find-queries-causing-resource_semaphore-waits-in-sql-server/">Read more</a></p>
<p>The post <a href="https://theserogroup.com/sql-server/how-to-find-queries-causing-resource_semaphore-waits-in-sql-server/">How to Find Queries Causing RESOURCE_SEMAPHORE Waits in SQL Server</a> appeared first on <a href="https://theserogroup.com">The SERO Group</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>The resource_semaphore wait can have devastating consequences for SQL Server performance. This wait essentially means that some of the queries in your workload require memory grants larger than the server&#8217;s memory can support. When that happens, SQL Server feels frozen and unresponsive. Queries are likely still running, but this wait causes a queue to build up as newly submitted queries wait for memory before they can start.</p>



<h1 class="wp-block-heading" id="h-what-is-the-resource-semaphore-wait">What is the resource_semaphore wait?</h1>



<p>A wait type of resource_semaphore means that there isn&#8217;t enough available memory to grant for queries to run. At a high level, here is what is happening. A query is submitted to the SQL Server engine for execution. As part of the pre-execution phase, SQL Server estimates how much memory it thinks the query will need to run; several factors influence this estimate. Assuming there is free memory to grant to the query, it moves along the execution phases and starts running.</p>



<p>But let’s say you have a server with 128 GB of RAM allocated to SQL Server, and a series of queries is submitted that are each granted 15 GB of RAM. SQL Server can handle at most 8 of those queries before it runs out of memory to grant, because 8 x 15 = 120 GB. The next query that needs another 15 GB of RAM is prevented from starting its execution: if granted memory, it would push the total to 135 GB, more than the server has available. So, it has to wait.</p>



<p>As other queries are submitted, they wait behind this 9th query that needs the additional 15 GB of RAM. If the other 8 queries that are already executing are long-running queries, it might be several minutes, or longer, before memory is available. Queries start stacking up behind each other. Users start noticing that the app is slow, pages aren’t refreshing, and reports aren’t completing. Soon, everyone is hitting F5 on the web app to resubmit queries because nothing is happening. Immediately following this, your phone or your Slack messages start blowing up!</p>



<h2 class="wp-block-heading" id="h-how-to-find-queries-with-large-memory-grants">How to find queries with large memory grants</h2>



<p>First, you can use sp_whoisactive to find the queries and the offending wait in real-time. This is very useful for the scenario above, where users feel the pain and start entering tickets and messaging people for help.</p>
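


<p>As a quick illustration, assuming sp_whoisactive is installed on the instance, a call like the following surfaces each active session&#8217;s wait information; recent versions also accept an @get_memory_info parameter that adds requested and granted memory columns:</p>



<pre class="wp-block-code"><code>-- Show active sessions and their waits; sessions stuck on
-- resource_semaphore will show that wait type in the wait_info column.
-- @get_memory_info (recent sp_WhoIsActive versions) adds memory grant columns.
EXEC dbo.sp_WhoIsActive @get_memory_info = 1;</code></pre>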



<p>However, let’s say your server is just teetering on the edge of its memory allocation. Memory grants are occasionally high enough that SQL Server registers the resource_semaphore wait, but it isn’t happening for long enough, or frequently enough, that users really notice and start complaining. Your SQL Server is experiencing slowness at times; it’s just not causing excruciating pain. This may show up as the wait appearing low in the result set from Paul Randal’s wait stats query, causing perhaps only a few seconds of wait, on average, when it happens. Users might not report it because their report or application screen completes after a brief delay, so they assume this is “normal.” This doesn’t mean you can or should ignore what the wait stats information is telling you. Act now, before this becomes a full-blown emergency.</p>
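


<p>As a quick spot check (not a substitute for a full wait stats script), you can look for the wait directly in sys.dm_os_wait_stats:</p>



<pre class="wp-block-code"><code>-- Cumulative waits since the last restart (or since stats were cleared);
-- the LIKE also matches RESOURCE_SEMAPHORE_QUERY_COMPILE.
SELECT wait_type, waiting_tasks_count, wait_time_ms
FROM sys.dm_os_wait_stats
WHERE wait_type LIKE N'RESOURCE_SEMAPHORE%';</code></pre>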



<p>Second, in the above scenario, you can use Extended Events to find queries with large memory grants. Extended Events are lightweight trace objects that can capture far more events than Profiler or a server-side trace. They also work differently: a session fires and captures data only when an event defined in the session happens, rather than capturing everything and then filtering, as Profiler does.</p>



<h2 class="wp-block-heading" id="h-setting-up-the-extended-events-session">Setting Up The Extended Events Session</h2>



<p>After some poking around and some experimenting, I arrived at the T-SQL below to create the Extended Events session. This code will capture any query with a memory grant greater than about 1 GB (1,024,000 KB). Adjust this threshold higher or lower as makes sense for your environment. The session stores the database ID, plan handle, session ID, and the T-SQL text for the offending query in a file. It will write to as many as 5 files, each 1 GB in size. When the 5th file is full, it deletes the oldest file and starts writing a new one.</p>



<p>One thing to be aware of is the path for the file that will hold the data. That path must exist first. I&#8217;m using C:\XE\NameofExtendedEventSession.xel. Be sure to update that path to a location that your SQL Server instance can access. Also note that the second event in the session below is filtered to a single database (database_id = 8 in this example); change or remove that predicate to fit your environment.</p>



<pre class="wp-block-code"><code>CREATE EVENT SESSION &#91;TrackHighMemoryGrants] ON SERVER 
ADD EVENT sqlserver.query_memory_grant_usage(
ACTION(sqlserver.database_id,sqlserver.plan_handle,sqlserver.session_id,sqlserver.sql_text)
    WHERE (&#91;granted_memory_kb]&gt;(1024000))),
ADD EVENT sqlserver.query_memory_grant_wait_end(
    ACTION(sqlserver.database_id,sqlserver.session_id,sqlserver.sql_text)
    WHERE (&#91;sqlserver].&#91;database_id]=(8) AND &#91;granted_memory_kb]&gt;(1024000)))
ADD TARGET package0.event_file(SET filename=N'C:\XE\HighMemoryGrants.xel',max_file_size=(1024),max_rollover_files=(5))
WITH (MAX_MEMORY=51200 KB,EVENT_RETENTION_MODE=ALLOW_SINGLE_EVENT_LOSS,MAX_DISPATCH_LATENCY=5 SECONDS,MAX_EVENT_SIZE=0 KB,MEMORY_PARTITION_MODE=NONE,TRACK_CAUSALITY=ON,STARTUP_STATE=OFF)
GO</code></pre>



<p>To start the Extended Events session, run the following T-SQL:</p>



<pre class="wp-block-code"><code>ALTER EVENT SESSION &#91;TrackHighMemoryGrants]
ON SERVER
STATE = START;</code></pre>



<p>This can also be done from SSMS. In Object Explorer, expand the instance to Management &gt; Extended Events &gt; Sessions, right-click “TrackHighMemoryGrants”, and select the “Start Session” option.</p>
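


<p>To confirm the session is running, query the DMV for active Extended Events sessions; a row is returned only while the session is started:</p>



<pre class="wp-block-code"><code>SELECT name, create_time
FROM sys.dm_xe_sessions
WHERE name = N'TrackHighMemoryGrants';</code></pre>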



<h2 class="wp-block-heading" id="h-how-to-query-extended-events-files">How to query Extended Events files</h2>



<p>Once started, the Extended Events session gathers data. The query below parses the collected files to show which queries have the highest memory grants on average.</p>



<pre class="wp-block-code"><code>
WITH ParsedEvents AS (
SELECT 
event_data.value('(event/action&#91;@name="sql_text"]/value)&#91;1]', 'nvarchar(max)') AS sql_text,
event_data.value('(event/data&#91;@name="granted_memory_kb"]/value)&#91;1]', 'bigint') / 1024.0 AS granted_mb
FROM (
    SELECT CAST(event_data AS XML) AS event_data
    FROM sys.fn_xe_file_target_read_file('C:\XE\HighMemoryGrants*.xel', NULL, NULL, NULL)
    ) AS x
)

SELECT 
sql_text,
COUNT(*) AS execution_count,
AVG(granted_mb) AS avg_granted_mb,
MAX(granted_mb) AS max_granted_mb,
MIN(granted_mb) AS min_granted_mb
FROM ParsedEvents
GROUP BY sql_text
ORDER BY avg_granted_mb DESC;
</code></pre>



<p>Now you can start dealing with the queries involved in that pesky wait before the problem brings your server to a screeching halt! By the way, the queries may lead you to a design problem in your tables that can cause high memory grants. More about that in a future post!</p>



<h2 class="wp-block-heading" id="h-want-to-work-with-the-sero-group">Want to work with The SERO Group?</h2>



<p>Want to learn more about how The SERO Group helps organizations take the guesswork out of managing their SQL Servers? <a href="https://theserogroup.com/contact-us/" target="_blank" rel="noreferrer noopener">Schedule a no-obligation discovery call</a>&nbsp;with us to get started.</p>
<p>The post <a href="https://theserogroup.com/sql-server/how-to-find-queries-causing-resource_semaphore-waits-in-sql-server/">How to Find Queries Causing RESOURCE_SEMAPHORE Waits in SQL Server</a> appeared first on <a href="https://theserogroup.com">The SERO Group</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">7713</post-id>	</item>
		<item>
		<title>Why Quiet Reflection Leads to Better IT Strategy Decisions</title>
		<link>https://theserogroup.com/azure/why-quiet-reflection-leads-to-better-it-strategy-decisions/</link>
		
		<dc:creator><![CDATA[Joe Webb]]></dc:creator>
		<pubDate>Wed, 17 Dec 2025 13:00:04 +0000</pubDate>
				<category><![CDATA[Azure]]></category>
		<category><![CDATA[Data Security]]></category>
		<category><![CDATA[DBA]]></category>
		<category><![CDATA[Events]]></category>
		<category><![CDATA[Professional Development]]></category>
		<category><![CDATA[SQL Community]]></category>
		<category><![CDATA[SQL Server]]></category>
		<category><![CDATA[SQL Server Consulting]]></category>
		<category><![CDATA[The Sero Group]]></category>
		<category><![CDATA[Clustering]]></category>
		<category><![CDATA[Clusters]]></category>
		<category><![CDATA[Database]]></category>
		<category><![CDATA[Database Administration]]></category>
		<category><![CDATA[Database Development]]></category>
		<category><![CDATA[IT Manager]]></category>
		<category><![CDATA[Microsoft Azure]]></category>
		<category><![CDATA[Public Speaking]]></category>
		<category><![CDATA[Script Library]]></category>
		<category><![CDATA[Sero]]></category>
		<category><![CDATA[Sero Group]]></category>
		<category><![CDATA[Serogroup]]></category>
		<category><![CDATA[Shared Disks]]></category>
		<category><![CDATA[SQL]]></category>
		<category><![CDATA[SQL Assessment]]></category>
		<category><![CDATA[SQL Audit]]></category>
		<category><![CDATA[SQL Conference]]></category>
		<category><![CDATA[SQL Consultant]]></category>
		<category><![CDATA[SQL Events]]></category>
		<category><![CDATA[SQL Security]]></category>
		<category><![CDATA[SQL Server Consultant]]></category>
		<category><![CDATA[SQL Server Management]]></category>
		<category><![CDATA[SQL Training]]></category>
		<category><![CDATA[TempDB]]></category>
		<guid isPermaLink="false">https://theserogroup.com/?p=7691</guid>

					<description><![CDATA[<p>Last Saturday, I woke up before dawn to a quiet house. My family was still asleep, as I’m the only morning person in our household. The Christmas tree lights cast a warm glow across the room, and I was alone with my thoughts and a hot cup of coffee. No urgent emails, no fire drills,&#8230; <br /> <a class="read-more" href="https://theserogroup.com/azure/why-quiet-reflection-leads-to-better-it-strategy-decisions/">Read more</a></p>
<p>The post <a href="https://theserogroup.com/azure/why-quiet-reflection-leads-to-better-it-strategy-decisions/">Why Quiet Reflection Leads to Better IT Strategy Decisions</a> appeared first on <a href="https://theserogroup.com">The SERO Group</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Last Saturday, I woke up before dawn to a quiet house. My family was still asleep, as I’m the only morning person in our household. The Christmas tree lights cast a warm glow across the room, and I was alone with my thoughts and a hot cup of coffee. No urgent emails, no fire drills, no meetings starting in five minutes. Just space to think.</p>



<p>As I sat there, I ended up reflecting back on 2025. I found myself gravitating to these three questions:</p>



<ul class="wp-block-list">
<li>What went well this year?</li>



<li>What did I learn?</li>



<li>What should I focus on next year?</li>
</ul>



<p>If you’re a leader, I’m guessing you rarely get this kind of thinking time during your workday. I know I don’t. Our calendars are packed with calls, team meetings, and those &#8220;quick questions” that turn into two-hour troubleshooting sessions.</p>



<p>But here&#8217;s what I&#8217;ve learned: <strong>the quality of your strategic decisions is directly tied to the quality of your thinking time.</strong></p>



<p>And thinking time doesn&#8217;t happen by accident. You have to protect it.</p>



<h3 class="wp-block-heading" id="h-what-went-well-this-year">What Went Well This Year?</h3>



<p>When I asked myself this question, I didn&#8217;t think about our biggest projects or flashiest achievements. I didn&#8217;t think about when we migrated almost 2,000 databases as part of an upgrade project. Or the performance tuning we did that resulted in a $36,000 reduction in annual Azure spend for a client. </p>



<p>Instead, I thought about the relationships we strengthened. The trust we built with clients. The problems we solved before they became crises.</p>



<p>For you, this might look like:</p>



<ul class="wp-block-list">
<li>The audit that went smoothly because your security documentation was solid</li>



<li>The successful disaster recovery test that was possible because you kept refining the process</li>



<li>The team member you mentored who&#8217;s now ready for more responsibility</li>



<li>The support resources you provided your team through a trusted partner</li>
</ul>



<p>These aren&#8217;t always the things that make it into board reports. But they&#8217;re the foundation that everything else is built on.</p>



<h3 class="wp-block-heading" id="h-what-did-i-learn">What Did I Learn?</h3>



<p>This year reminded me of something Eisenhower once said: <strong>&#8220;Plans are worthless, but planning is everything.&#8221;</strong></p>



<p>The need for planning cannot be overstated. It&#8217;s critical. Even if the plan doesn&#8217;t always work out the way you intended. </p>



<p><strong>The plan itself wasn&#8217;t the point. The thinking I did while creating the plan was the point.</strong></p>



<p>Because I’d thought through our capacity, our ideal client profile, and our service delivery model, I could adjust quickly when reality didn’t match my spreadsheet. I knew which opportunities were a good fit for us and which ones to let go. Because we’ve intentionally built a small but incredibly talented team that genuinely wants to see our clients succeed, we were able to identify and create ways to help them.</p>



<p>I watched the same dynamic play out with clients. The institutions that had documented their SQL Server environments, tested their disaster recovery plans, and mapped their compliance requirements adapted quickly when needed. They were positioned for success even when the unexpected happened.</p>



<p>Planning isn&#8217;t about predicting the future. It&#8217;s about <strong>building the muscle memory to respond when the future surprises you.</strong></p>



<p>What did you learn this year about planning and adapting? Maybe it was:</p>



<ul class="wp-block-list">
<li>That your three-year technology roadmap needs quarterly reviews, not just annual ones</li>



<li>That the disaster recovery plan sitting in a SharePoint folder isn&#8217;t the same as a tested DR plan</li>



<li>That &#8220;we&#8217;ll address that next quarter&#8221; eventually becomes &#8220;why didn&#8217;t we address this sooner?&#8221;</li>



<li>That having an expert on call beats having a plan to find an expert when something breaks</li>
</ul>



<p>These lessons matter. Write them down. They&#8217;re not just hindsight—they&#8217;re your blueprint for better decisions ahead.</p>



<h3 class="wp-block-heading" id="h-what-should-i-focus-on-next-year">What Should I Focus On Next Year?</h3>



<p>For me, the answer was clear: <strong>I need to help more financial institutions and healthcare organizations understand that they have options.</strong> Most CIOs think they have two choices for database management: hire a full-time DBA (expensive and hard to find) or make do with whoever can &#8220;figure it out&#8221; (risky and unsustainable).</p>



<p>There&#8217;s a third option: fractional DBA services that give you expert oversight without the full-time price tag. </p>



<p>For you, your focus might be different. Maybe it&#8217;s:</p>



<ul class="wp-block-list">
<li>Finally getting your SQL Server environment documented and audit-ready</li>



<li>Building a disaster recovery plan that you&#8217;ve actually tested</li>



<li>Move a little further along the <a href="https://theserogroup.com/data-strategy/sql-server-maturity-curve-how-banks-move-from-reactive-risk-to-strategic-advantage/">SQL Server Maturity Curve</a></li>



<li>Finding a partner who understands banking compliance, not just databases</li>
</ul>



<p>Whatever it is, the key is to actually choose something. Not everything. Something. And move toward it. Make progress.</p>



<h3 class="wp-block-heading" id="h-the-power-of-quiet-reflection">The Power of Quiet Reflection</h3>



<p>Here&#8217;s the thing about those early Saturday morning moments: they&#8217;re rare. And precious. </p>



<p>During the week, we’re in execution mode. We’re responding, reacting, solving, and fixing. That’s necessary work. But it’s not strategic work.</p>



<p>Strategic work requires space. It requires stepping back from the urgent to focus on the important.</p>



<p>So, here&#8217;s my challenge to you as we wind down 2025 and usher in the new year:</p>



<h3 class="wp-block-heading" id="h-block-off-time-just-to-think-then-protect-it">Block Off Time Just to Think, Then Protect It</h3>



<p>Maybe it&#8217;s Saturday mornings before your family wakes up. Maybe it&#8217;s a long walk at lunch. Maybe it&#8217;s 90 minutes with your calendar blocked and your office door closed. </p>



<p>Whatever it is, protect it. The decisions you make during that quiet time about where to focus, what risks to address, and which partnerships to invest in will help shape your entire year.</p>



<h3 class="wp-block-heading" id="h-your-turn">Your Turn</h3>



<p>As you think about the year ahead, I&#8217;d encourage you to ask yourself those three questions:</p>



<ol class="wp-block-list">
<li>What went well this year? Celebrate it. Learn from it.</li>



<li>What did I learn? Write it down. It&#8217;s wisdom you paid for.</li>



<li>What should I focus on next year? Pick one or two things. Not everything.</li>
</ol>



<p>And if one of those focus areas is &#8220;finally get our SQL Server environment to a place where I&#8217;m confident, not just hopeful,&#8221; let&#8217;s talk. That&#8217;s exactly what we help institutions do.</p>



<p>If you&#8217;re a CIO wondering whether your SQL Server environment is as healthy and secure as it should be, I&#8217;d be happy to have a conversation. No sales pitch. Just two people talking candidly about database management. <a href="https://theserogroup.com/contact-us/" target="_blank" rel="noreferrer noopener">Schedule a time here</a>.</p>
<p>The post <a href="https://theserogroup.com/azure/why-quiet-reflection-leads-to-better-it-strategy-decisions/">Why Quiet Reflection Leads to Better IT Strategy Decisions</a> appeared first on <a href="https://theserogroup.com">The SERO Group</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">7691</post-id>	</item>
		<item>
		<title>How to Enable Query Store in SQL Server: A Step-by-Step Guide</title>
		<link>https://theserogroup.com/sql-server/how-to-enable-query-store-in-sql-server-a-step-by-step-guide/</link>
		
		<dc:creator><![CDATA[Lee Markum]]></dc:creator>
		<pubDate>Wed, 12 Nov 2025 13:00:05 +0000</pubDate>
				<category><![CDATA[SQL Server]]></category>
		<category><![CDATA[Database]]></category>
		<category><![CDATA[Database Administration]]></category>
		<category><![CDATA[Database Development]]></category>
		<category><![CDATA[IT Manager]]></category>
		<category><![CDATA[Query Store]]></category>
		<category><![CDATA[Script Library]]></category>
		<category><![CDATA[SQL]]></category>
		<category><![CDATA[SQL Script Library]]></category>
		<category><![CDATA[SQL Server Management]]></category>
		<guid isPermaLink="false">https://theserogroup.com/?p=7614</guid>

					<description><![CDATA[<p>In my previous post about Query Store, I wrote about the four key benefits of enabling Query Store. Now that I&#8217;ve convinced you to turn it on, how do you do that? One thing to point out is that in SQL Server 2022 and above, when creating a new database from the SSMS GUI or&#8230; <br /> <a class="read-more" href="https://theserogroup.com/sql-server/how-to-enable-query-store-in-sql-server-a-step-by-step-guide/">Read more</a></p>
<p>The post <a href="https://theserogroup.com/sql-server/how-to-enable-query-store-in-sql-server-a-step-by-step-guide/">How to Enable Query Store in SQL Server: A Step-by-Step Guide</a> appeared first on <a href="https://theserogroup.com">The SERO Group</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>In <a href="https://theserogroup.com/dba/4-key-performance-benefits-of-enabling-query-store/" target="_blank" rel="noreferrer noopener">my previous post about Query Store</a>, I wrote about the four key benefits of enabling Query Store. Now that I&#8217;ve convinced you to turn it on, how do you do that?</p>



<p>One thing to point out is that in SQL Server 2022 and above, when creating a new database from the SSMS GUI or by simply using the CREATE DATABASE MyNewDB syntax, the Query Store option will be on by default. For databases restored to SQL Server 2016 or later, the Query Store&#8217;s status from the original system will remain unchanged when the database is restored on the new instance.</p>



<p>Let&#8217;s go through the three ways to enable Query Store.</p>



<ol class="wp-block-list">
<li>Manually in SQL Server Management Studio</li>



<li>Using T-SQL</li>



<li>Using PowerShell</li>
</ol>



<h3 class="wp-block-heading" id="h-1-enabling-query-store-using-sql-server-management-studio">1. Enabling Query Store using SQL Server Management Studio:</h3>



<p>Since you’re likely already comfortable using SQL Server Management Studio for queries and database maintenance, SSMS offers a convenient, familiar method for getting started with Query Store.</p>



<h4 class="wp-block-heading" id="h-steps-to-enable-query-store-using-ssms">Steps to Enable Query Store using SSMS</h4>



<ol class="wp-block-list">
<li>Connect to a SQL Server instance running SQL Server 2016 or higher.</li>



<li>Click the &#8216;+&#8217; sign next to the Databases folder to expand and see the list of databases.</li>



<li>Right-click on the database name and select &#8220;Properties.&#8221;</li>



<li>Left-click the &#8220;Query Store&#8221; option on the left-hand side of the GUI.</li>



<li>Change the Operation Mode (Requested) option from &#8220;Off&#8221; to &#8220;Read write.&#8221;</li>



<li>Click OK to apply the change and enable Query Store.</li>
</ol>
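


<p>If you want to verify the change took effect, each database exposes its Query Store settings through a catalog view. Run this in the database you just changed; actual_state_desc should report READ_WRITE:</p>



<pre class="wp-block-code"><code>SELECT actual_state_desc, desired_state_desc, max_storage_size_mb
FROM sys.database_query_store_options;</code></pre>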



<h4 class="wp-block-heading" id="h-further-details">Further Details</h4>



<p>Here is what you will see after step 3. The Query Store option mentioned in step 4 is at the bottom of the list of options, as shown below.</p>



<figure class="wp-block-image size-full"><a href="https://theserogroup.com/wp-content/uploads/2025/11/SelectingQueryStoreOptionInSelectAPage.png"><img decoding="async" width="172" height="216" src="https://theserogroup.com/wp-content/uploads/2025/11/SelectingQueryStoreOptionInSelectAPage.png" alt="" class="wp-image-7617"/></a></figure>



<p>Left-clicking that Query Store option will cause the below to show up on the right of the SSMS GUI.</p>



<p>What you see there are the defaults for SQL Server 2019 and above. If you are enabling Query Store on 2016 or 2017, you will want to adjust some of these defaults. Prior to 2019, the default for &#8220;Query Store Capture Mode&#8221; was &#8220;All.&#8221; Change this option to &#8220;Auto&#8221; instead.</p>



<p>Furthermore, the default for Max_Storage_Size_MB was far too low in 2016 and 2017, and is arguably still too low in 2019. This value represents the maximum amount of space that Query Store data will occupy in the database in which it was enabled. A good starting value is 2048 MB. It may be necessary to raise that to as much as 4096 MB in order to capture queries for the entire length of the &#8220;Stale Query Threshold (Days)&#8221; value.</p>



<p>The &#8220;Stale Query Threshold (Days)&#8221; option controls how many days of Query Store data will be kept. If the max storage size is set too low for a retention value of 30 days, then Query Store will start deleting collected data in the system tables to keep Query Store below the max storage size. This could result in having less data available than you intend for troubleshooting.</p>
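


<p>Both of these settings can also be changed together with T-SQL; for example, using a hypothetical database name:</p>



<pre class="wp-block-code"><code>-- Keep 30 days of Query Store data and allow it up to 2 GB of space
ALTER DATABASE &#91;MyDB] SET QUERY_STORE
    (MAX_STORAGE_SIZE_MB = 2048,
     CLEANUP_POLICY = (STALE_QUERY_THRESHOLD_DAYS = 30));</code></pre>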



<p>The rest of the defaults are acceptable and can be left as they are.</p>



<figure class="wp-block-image size-full"><a href="https://theserogroup.com/wp-content/uploads/2025/11/QueryStore2019DefaultsInSSMS-1.png"><img fetchpriority="high" decoding="async" width="699" height="488" src="https://theserogroup.com/wp-content/uploads/2025/11/QueryStore2019DefaultsInSSMS-1.png" alt="" class="wp-image-7621" srcset="https://theserogroup.com/wp-content/uploads/2025/11/QueryStore2019DefaultsInSSMS-1.png 699w, https://theserogroup.com/wp-content/uploads/2025/11/QueryStore2019DefaultsInSSMS-1-300x209.png 300w" sizes="(max-width: 699px) 100vw, 699px" /></a></figure>



<h3 class="wp-block-heading" id="h-2-enabling-query-store-using-t-sql">2. Enabling Query Store using T-SQL</h3>



<p>The T-SQL language is, of course, the language of SQL Server, and it is often more flexible than the SSMS GUI. Notice in the screenshot above that there is a &#8220;Script&#8221; button. If you click that instead of &#8220;OK,&#8221; SSMS will script the options from the GUI into a query window, letting you see exactly what the GUI does. T-SQL also makes it easier to enable Query Store on multiple databases; you can use a construct like sp_msforeachdb to enable it on several databases at once.</p>



<pre class="wp-block-code"><code>USE &#91;master]
GO
ALTER DATABASE &#91;MyDB] SET QUERY_STORE = ON
GO
ALTER DATABASE &#91;MyDB] SET QUERY_STORE (OPERATION_MODE = READ_WRITE, MAX_STORAGE_SIZE_MB = 2048)
GO</code></pre>
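


<p>As a sketch of the multi-database approach, the undocumented sp_msforeachdb procedure can apply the same ALTER DATABASE statement to every user database; the DB_ID check below skips the system databases:</p>



<pre class="wp-block-code"><code>-- sp_MSforeachdb substitutes each database name for the ? placeholder.
-- Database IDs 1-4 are master, tempdb, model, and msdb.
EXEC sp_MSforeachdb N'IF DB_ID(''?'') &gt; 4
    ALTER DATABASE &#91;?] SET QUERY_STORE (OPERATION_MODE = READ_WRITE, MAX_STORAGE_SIZE_MB = 2048);';</code></pre>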



<h3 class="wp-block-heading" id="h-3-enabling-query-store-using-powershell">3. Enabling Query Store using PowerShell</h3>



<p>Many accidental DBAs, those folks who were “voluntold” to start managing SQL Server, are network engineers, sysadmins, or cloud admins. Automation is often music to their ears, and in the Windows universe, PowerShell is a go-to method for automating tasks. Consequently, using PowerShell to automate the enabling of Query Store may feel natural to accidental DBAs. For the below command, the DBATools module will be needed in your environment.</p>



<p>Below is how Query Store could be enabled on all user databases on an instance of SQL Server. If you only want to enable Query Store on a few select databases on an instance, then add the -Database parameter with a comma-separated list of databases.</p>



<pre class="wp-block-code"><code>Set-DbaDbQueryStoreOption -SqlInstance ServerA -State ReadWrite `
    -FlushInterval 900 -CollectionInterval 60 -MaxSize 4096 `
    -CaptureMode Auto -CleanupMode Auto -StaleQueryThreshold 30 -WaitStatsCaptureMode ON</code></pre>



<p>Also, if your SQL Server environment has the Registered Server feature set up, then PowerShell can be used to read the servers registered there, loop over them, and enable Query Store on all user databases across your environment. This would be done using the Get-DbaRegServer command in the DBATools module.</p>



<h3 class="wp-block-heading" id="h-trace-flags-for-query-store">Trace Flags for Query Store</h3>



<p>If you aren’t familiar with Trace Flags, these are numeric switches that Microsoft uses to enable certain behaviors in the database engine. They are sometimes short-term fixes whose functionality is later built into the database engine itself, as happened with Query Store. There are two trace flags to know about and enable. Notice that flag 7752 isn’t needed on SQL Server 2019 and above.</p>



<p>Trace Flag 7745 – Prevents SQL Server from flushing Query Store data to disk during a shutdown or failover, so the shutdown or failover isn’t delayed. The trade-off is that the most recently collected Query Store data may be lost.</p>



<p>Trace Flag 7752 – Loads Query Store data into memory asynchronously at database startup, so queries aren’t blocked while Query Store initializes. This behavior is built into the engine starting with SQL Server 2019.</p>
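


<p>Both flags can be enabled globally at runtime with DBCC TRACEON. To make them persist across restarts, add them as -T startup parameters in SQL Server Configuration Manager:</p>



<pre class="wp-block-code"><code>-- The -1 argument applies the flags globally; this lasts until the next restart
DBCC TRACEON (7745, 7752, -1);

-- Verify which trace flags are active
DBCC TRACESTATUS (7745, 7752);</code></pre>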



<h3 class="wp-block-heading" id="h-want-to-work-with-the-sero-group">Want to Work With The SERO Group?</h3>



<p>Want to learn more about how The SERO Group helps organizations take the guesswork out of managing their SQL Servers? <a href="https://theserogroup.com/contact-us/" target="_blank" rel="noreferrer noopener">Schedule a no-obligation discovery call</a>&nbsp;with us to get started.</p>
<p>The post <a href="https://theserogroup.com/sql-server/how-to-enable-query-store-in-sql-server-a-step-by-step-guide/">How to Enable Query Store in SQL Server: A Step-by-Step Guide</a> appeared first on <a href="https://theserogroup.com">The SERO Group</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">7614</post-id>	</item>
		<item>
		<title>4 Key Performance Benefits of Enabling Query Store</title>
		<link>https://theserogroup.com/dba/4-key-performance-benefits-of-enabling-query-store/</link>
		
		<dc:creator><![CDATA[Lee Markum]]></dc:creator>
		<pubDate>Wed, 15 Oct 2025 12:00:00 +0000</pubDate>
				<category><![CDATA[DBA]]></category>
		<category><![CDATA[Database]]></category>
		<category><![CDATA[Database Administration]]></category>
		<category><![CDATA[Database Development]]></category>
		<category><![CDATA[IT Manager]]></category>
		<category><![CDATA[Query Store]]></category>
		<category><![CDATA[Script Library]]></category>
		<category><![CDATA[Sero]]></category>
		<category><![CDATA[Sero Group]]></category>
		<category><![CDATA[Serogroup]]></category>
		<category><![CDATA[SQL]]></category>
		<category><![CDATA[SQL Consultant]]></category>
		<category><![CDATA[SQL Server]]></category>
		<category><![CDATA[SQL Server Consultant]]></category>
		<category><![CDATA[SQL Server Management]]></category>
		<category><![CDATA[The Sero Group]]></category>
		<guid isPermaLink="false">https://theserogroup.com/?p=7565</guid>

					<description><![CDATA[<p>Query Store has been around since SQL Server 2016, but its full potential often goes untapped. Some companies were initially wary of it after some edge case problems arose during its initial rollout. However, since its initial release, Query Store has undergone numerous enhancements and is rapidly establishing itself as one of the most significant&#8230; <br /> <a class="read-more" href="https://theserogroup.com/dba/4-key-performance-benefits-of-enabling-query-store/">Read more</a></p>
<p>The post <a href="https://theserogroup.com/dba/4-key-performance-benefits-of-enabling-query-store/">4 Key Performance Benefits of Enabling Query Store</a> appeared first on <a href="https://theserogroup.com">The SERO Group</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Query Store has been around since SQL Server 2016, but its full potential often goes untapped. Some companies were initially wary of it after some edge case problems arose during its initial rollout. However, since its initial release, Query Store has undergone numerous enhancements and is rapidly establishing itself as one of the most significant advancements in SQL Server, comparable to the SQL Server DMVs introduced in SQL Server 2005.</p>



<p>What are the benefits of enabling Query Store? While there are many technical reasons, here are my top four broad advantages to consider.</p>



<h3 class="wp-block-heading" id="h-1-free-sql-server-monitoring">1. Free SQL Server monitoring</h3>



<p>Your business has already paid for Query Store in the SQL Server licensing. With SQL Server 2016 and later, it is accessible at the database level. This means that smaller shops without a large enterprise environment don&#8217;t have to spend large sums of money on third-party software to get observability. Query Store&#8217;s native capture mechanisms can provide significant insight into your SQL Server&#8217;s performance, all without costing you any more money!</p>



<h3 class="wp-block-heading" id="h-2-capture-foundational-sql-server-performance-indicators">2. Capture foundational SQL Server performance indicators</h3>



<p>Query Store collects the data already present in your SQL Server, displaying it in easy-to-understand graphs and reports. With Query Store, values for CPU, memory, duration, and more can be viewed based on MAX/AVG/STD Deviation metrics per query. This provides valuable insights into core metrics that shape the performance of your applications. Furthermore, this data allows your company to see not only how specific queries behaved when there was a performance problem but also to trend those queries over time to see shifts in performance.</p>
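<p>The same data behind those graphs can be queried directly through the Query Store catalog views. The sketch below (the TOP count and grouping are illustrative choices; times are reported in microseconds) surfaces the heaviest queries by average CPU:</p>



<pre class="wp-block-code"><code>-- Top 10 queries by average CPU time, with execution counts and max duration
SELECT TOP 10
	q.query_id,
	qt.query_sql_text,
	SUM(rs.count_executions) AS executions,
	AVG(rs.avg_cpu_time) AS avg_cpu_time,
	MAX(rs.max_duration) AS max_duration
FROM sys.query_store_runtime_stats AS rs
INNER JOIN sys.query_store_plan AS p ON rs.plan_id = p.plan_id
INNER JOIN sys.query_store_query AS q ON p.query_id = q.query_id
INNER JOIN sys.query_store_query_text AS qt ON q.query_text_id = qt.query_text_id
GROUP BY q.query_id, qt.query_sql_text
ORDER BY AVG(rs.avg_cpu_time) DESC;</code></pre>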



<p>SQL Server wait statistics are also captured and displayed in Query Store. When a query needs a resource, like CPU or data read from disk, a wait type is assigned to the query. These various waits affect query performance in a multitude of ways, and Query Store surfaces those performance-impacting waits for you. For example, the Query Wait Statistics report may show large bar graphs for BUFFER IO and CPU. Queries appearing in both graphs may be suffering from large table scans because of missing indexes.</p>
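<p>Those wait categories are also available in T-SQL through the sys.query_store_wait_stats catalog view (available in SQL Server 2017 and later). A minimal sketch to see where your databases spend their wait time:</p>



<pre class="wp-block-code"><code>-- Total wait time by category across all captured queries (SQL Server 2017+)
SELECT
	ws.wait_category_desc,
	SUM(ws.total_query_wait_time_ms) AS total_wait_ms
FROM sys.query_store_wait_stats AS ws
GROUP BY ws.wait_category_desc
ORDER BY total_wait_ms DESC;</code></pre>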



<p>Additionally, Query Store captures the query plans associated with queries. Think of query plans as the blueprint for how the query will be executed. These plans contain data about the decisions SQL Server is making about your data and how to process it. Some decisions revealed in the query plan can pinpoint performance issues. For example, query plans that regularly contain table scan operators may indicate missing indexes that force SQL Server to scan millions of rows when it only needs to retrieve a few thousand rows.</p>



<h3 class="wp-block-heading" id="h-3-talk-to-your-vendors-with-data-in-hand">3. Talk to your vendors with data in hand</h3>



<p>Commercial off-the-shelf (COTS) software vendors need to see hard data when approached with a performance problem. Query Store can provide that data. Without it, you can report a problem, but the software vendor is unlikely to consider making changes.</p>



<p>If you engage a DBA as a Service company, having performance data in hand will go a long way toward building a good relationship with that vendor. They will see your preparedness and appreciate it. It will also allow them to solve your problem faster, and isn&#8217;t that what you really want anyway?</p>



<h3 class="wp-block-heading" id="h-4-allow-your-applications-to-take-advantage-of-new-performance-features">4. Allow your applications to take advantage of new performance features</h3>



<p>Newer versions of SQL Server have a collection of features known as Intelligent Query Processing (IQP). Features such as memory grant feedback, degree of parallelism feedback, and more are tied into IQP. These features depend on Query Store. Without Query Store running and without using the appropriate database compatibility level, your applications are missing out on performance-enhancing features that make queries execute faster, use fewer resources, or do both at the same time.</p>
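<p>Both prerequisites can be set and verified with a couple of statements. This is a sketch: substitute your own database name, and note that compatibility level 160 corresponds to SQL Server 2022.</p>



<pre class="wp-block-code"><code>-- Enable Query Store and raise the compatibility level so IQP features can engage
ALTER DATABASE YourDatabase SET QUERY_STORE = ON;
ALTER DATABASE YourDatabase SET COMPATIBILITY_LEVEL = 160;

-- Verify both settings
SELECT name, compatibility_level, is_query_store_on
FROM sys.databases
WHERE name = 'YourDatabase';</code></pre>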



<h3 class="wp-block-heading" id="h-want-to-work-with-the-sero-group">Want to work with The SERO Group?</h3>



<p>Want to learn more about how The SERO Group helps organizations take the guesswork out of managing their SQL Servers? <a href="https://theserogroup.com/contact-us/" target="_blank" rel="noreferrer noopener">Schedule a no-obligation discovery call</a>&nbsp;with us to get started.</p>
<p>The post <a href="https://theserogroup.com/dba/4-key-performance-benefits-of-enabling-query-store/">4 Key Performance Benefits of Enabling Query Store</a> appeared first on <a href="https://theserogroup.com">The SERO Group</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">7565</post-id>	</item>
		<item>
		<title>How to Troubleshoot SQL Server Database Mail Issues Using Built-In View</title>
		<link>https://theserogroup.com/sql-server/how-to-troubleshoot-sql-server-database-mail-issues-using-built-in-view/</link>
		
		<dc:creator><![CDATA[Eric Cobb]]></dc:creator>
		<pubDate>Wed, 23 Apr 2025 12:00:30 +0000</pubDate>
				<category><![CDATA[SQL Server]]></category>
		<category><![CDATA[Database Administration]]></category>
		<category><![CDATA[Script Library]]></category>
		<category><![CDATA[SQL]]></category>
		<category><![CDATA[SQL Server Management]]></category>
		<guid isPermaLink="false">https://theserogroup.com/?p=7334</guid>

					<description><![CDATA[<p>Sending emails from a SQL Server via sp_send_dbmail is common, but troubleshooting problems can be frustrating. Fortunately, SQL Server logs every email attempt, making it easier to find and fix issues. View All Messages Processed By Database Mail The sysmail_allitems view shows every email that Database Mail has tried to process, whether it was successful&#8230; <br /> <a class="read-more" href="https://theserogroup.com/sql-server/how-to-troubleshoot-sql-server-database-mail-issues-using-built-in-view/">Read more</a></p>
<p>The post <a href="https://theserogroup.com/sql-server/how-to-troubleshoot-sql-server-database-mail-issues-using-built-in-view/">How to Troubleshoot SQL Server Database Mail Issues Using Built-In View</a> appeared first on <a href="https://theserogroup.com">The SERO Group</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Sending emails from a SQL Server via <a href="https://learn.microsoft.com/en-us/sql/relational-databases/system-stored-procedures/sp-send-dbmail-transact-sql?view=sql-server-ver16"><em>sp_send_dbmail</em></a> is common, but troubleshooting problems can be frustrating. Fortunately, SQL Server logs every email attempt, making it easier to find and fix issues.</p>



<h3 class="wp-block-heading" id="h-view-all-messages-processed-by-database-mail">View All Messages Processed By Database Mail</h3>



<p>The <a href="https://msdn.microsoft.com/en-us/library/ms175056.aspx" target="_blank" rel="noreferrer noopener"><em>sysmail_allitems</em></a> view shows every email that Database Mail has tried to process, whether it was successful or not. This view can help you identify problems by giving you the status of each email, so you can compare the details of messages that were sent with those that weren’t.</p>



<p>Here are the status values you’ll see:</p>



<figure class="wp-block-table"><table class="has-fixed-layout"><tbody><tr><td>sent</td><td>The message was successfully sent.</td></tr><tr><td>unsent</td><td>The message is waiting to be sent.</td></tr><tr><td>retrying</td><td>Send attempt failed; retrying soon.</td></tr><tr><td>failed</td><td>The message could not be sent.</td></tr></tbody></table></figure>
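<p>A quick check against the view might look like the sketch below (the TOP count is an arbitrary choice):</p>



<pre class="wp-block-code"><code>-- Recent Database Mail items and their current status
SELECT TOP 50
	mailitem_id,
	recipients,
	subject,
	sent_status,
	send_request_date,
	sent_date
FROM msdb.dbo.sysmail_allitems
ORDER BY send_request_date DESC;</code></pre>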



<h3 class="wp-block-heading" id="h-view-sent-messages">View Sent Messages</h3>



<p>Use <a href="https://msdn.microsoft.com/en-us/library/ms174372.aspx" target="_blank" rel="noreferrer noopener">sysmail_sentitems</a> to see which messages were successfully sent. Database Mail marks a message as sent when it is successfully submitted and accepted by the SMTP mail server. However, just because an email was accepted by the mail server does not mean it was actually delivered to the recipient. This just means that SQL Server succeeded in giving the message to the email server, and it is up to the email server to actually send the email. If something went wrong after SQL Server passed it off (like a bounce or spam filter snag), you won’t see that here. Those kinds of issues are up to the mail server to handle and won’t show up in SQL Server logs.</p>



<h3 class="wp-block-heading" id="h-view-unsent-messages">View Unsent Messages</h3>



<p>Use the <a href="https://learn.microsoft.com/en-us/sql/relational-databases/system-catalog-views/sysmail-unsentitems-transact-sql?view=sql-server-ver16">sysmail_unsentitems</a> view to find emails still waiting to be sent or retrying after a failure.<br>Messages marked unsent or retrying remain in the mail queue and could be sent at any time.<br>Normally, you will not see many messages here unless there is a delay or email processing issue.</p>



<p>Database Mail marks a message as unsent when it waits in the queue but has not yet been processed.<br>If a message shows a retrying status, Database Mail tried to send it but could not reach the SMTP server. Retry behavior depends on the settings for Account Retry Delay and Account Retry Attempts.<br>(<em>For configuration details, see the</em> <em><a href="https://learn.microsoft.com/en-us/sql/relational-databases/system-stored-procedures/sysmail-configure-sp-transact-sql?view=sql-server-ver16">sysmail_configure_sp</a> stored procedure</em>.)</p>



<h3 class="wp-block-heading" id="h-view-failed-messages">View Failed Messages</h3>



<p>The <a href="https://msdn.microsoft.com/en-us/library/ms187747.aspx" target="_blank" rel="noreferrer noopener"><em>sysmail_faileditems</em></a> view returns only the messages with the <em>failed</em> status. Use this view to determine which messages Database Mail could not send and to get message details that help you identify the nature of the problem.</p>



<p>To search for errors that are related to failed emails, use the <em>sysmail_faileditems</em> view to get the <em>mailitem_id</em> of the failed email, and then search for that <em>mailitem_id</em> in <a href="https://learn.microsoft.com/en-us/sql/relational-databases/system-catalog-views/sysmail-event-log-transact-sql?view=sql-server-ver16"><em>sysmail_event_log</em></a>.</p>
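<p>That lookup can be done in one step by joining the two views, as in this sketch:</p>



<pre class="wp-block-code"><code>-- Failed messages with their corresponding Database Mail log entries
SELECT
	f.mailitem_id,
	f.recipients,
	f.subject,
	l.description AS error_description,
	l.log_date
FROM msdb.dbo.sysmail_faileditems AS f
INNER JOIN msdb.dbo.sysmail_event_log AS l ON f.mailitem_id = l.mailitem_id
ORDER BY l.log_date DESC;</code></pre>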



<p>Microsoft also offers this extensive list of <a href="https://learn.microsoft.com/en-us/troubleshoot/sql/tools/troubleshoot-database-mail-issues">Database Mail troubleshooting techniques</a>. </p>



<h3 class="wp-block-heading" id="h-view-email-attachments">View Email Attachments</h3>



<p>Use the <a href="https://msdn.microsoft.com/en-us/library/ms187954.aspx" target="_blank" rel="noreferrer noopener">sysmail_mailattachments</a> view to list each attachment sent by Database Mail and its properties, such as file name and file size. This view does not store the attachment content, but it lets you track and verify attachments associated with each email. To link attachments to specific emails, cross-reference the <em>mailitem_id</em> with the other Database Mail views listed above.</p>
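<p>For example, a cross-reference back to <em>sysmail_allitems</em> might look like this sketch:</p>



<pre class="wp-block-code"><code>-- Attachments and the emails they were sent with
SELECT
	a.mailitem_id,
	m.subject,
	a.filename,
	a.filesize
FROM msdb.dbo.sysmail_mailattachments AS a
INNER JOIN msdb.dbo.sysmail_allitems AS m ON a.mailitem_id = m.mailitem_id
ORDER BY a.mailitem_id DESC;</code></pre>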



<h2 class="wp-block-heading" id="h-take-control-of-your-database-mail">Take Control of Your Database Mail</h2>



<p>Don’t let email issues slow you down. Proactive SQL Server Database Mail troubleshooting keeps your environment stable and your team confident.</p>



<h2 class="wp-block-heading" id="h-tired-of-troubleshooting-sql-server-issues-alone">Tired of Troubleshooting SQL Server Issues Alone?</h2>



<p>Let The SERO Group help you keep your SQL Servers running smoothly. <a href="https://theserogroup.com/contact-us/" target="_blank" rel="noreferrer noopener">Schedule a quick, no-pressure discovery call</a> today.</p>
<p>The post <a href="https://theserogroup.com/sql-server/how-to-troubleshoot-sql-server-database-mail-issues-using-built-in-view/">How to Troubleshoot SQL Server Database Mail Issues Using Built-In View</a> appeared first on <a href="https://theserogroup.com">The SERO Group</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">7334</post-id>	</item>
		<item>
		<title>Why Tail-Log Backups Matter for SQL Server Recovery and Migration</title>
		<link>https://theserogroup.com/sql-server/why-tail-log-backups-matter-for-sql-server-recovery-and-migration/</link>
		
		<dc:creator><![CDATA[Luke Campbell]]></dc:creator>
		<pubDate>Wed, 16 Apr 2025 12:00:23 +0000</pubDate>
				<category><![CDATA[SQL Server]]></category>
		<category><![CDATA[Database]]></category>
		<category><![CDATA[Database Administration]]></category>
		<category><![CDATA[IT Manager]]></category>
		<category><![CDATA[Script Library]]></category>
		<category><![CDATA[SQL]]></category>
		<category><![CDATA[SQL Consultant]]></category>
		<category><![CDATA[SQL Events]]></category>
		<category><![CDATA[SQL Security]]></category>
		<category><![CDATA[SQL Server Management]]></category>
		<guid isPermaLink="false">https://theserogroup.com/?p=7302</guid>

					<description><![CDATA[<p>In previous posts, we&#8217;ve covered the more routine types of backups available within SQL Server — full, differential, and transaction log backups. While you may not use them as often, you should also be aware of tail-log backups when managing SQL Server. Tail-log backups can help in two scenarios. What is a tail-log backup? Tail-log&#8230; <br /> <a class="read-more" href="https://theserogroup.com/sql-server/why-tail-log-backups-matter-for-sql-server-recovery-and-migration/">Read more</a></p>
<p>The post <a href="https://theserogroup.com/sql-server/why-tail-log-backups-matter-for-sql-server-recovery-and-migration/">Why Tail-Log Backups Matter for SQL Server Recovery and Migration</a> appeared first on <a href="https://theserogroup.com">The SERO Group</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>In previous posts, we&#8217;ve covered the more routine types of backups available within SQL Server — full, differential, and transaction log backups. While you may not use them as often, you should also be aware of tail-log backups when managing SQL Server. Tail-log backups can help in two scenarios.</p>



<ol class="wp-block-list">
<li><strong>Recovering after a database outage</strong></li>



<li><strong>Database migrations</strong></li>
</ol>



<h2 class="wp-block-heading" id="h-what-is-a-tail-log-backup">What is a tail-log backup?</h2>



<p>Tail-log backups capture transaction log records that haven&#8217;t been backed up yet and, when taken WITH NORECOVERY, place the database into a &#8220;restoring&#8221; state. Think of it as the last transaction log backup taken right before a database is restored, moved, or damaged. Its primary purpose is to capture the &#8220;tail&#8221; end of the log (any transactions that occurred since the last regular log backup) to prevent data loss.</p>



<h2 class="wp-block-heading" id="h-when-are-tail-log-backups-important">When are tail-log backups important?</h2>



<h3 class="wp-block-heading" id="h-scenario-1-preventing-data-loss-after-a-failure">Scenario 1: Preventing data loss after a failure</h3>



<p>Imagine that you schedule transaction log backups to occur every 15 minutes. At 10:05 AM, the server hosting your database experiences a critical failure, taking the database offline. Your last log backup was at 10:00 AM. This means that transactions that have occurred between 10:00 AM and 10:05 AM would be lost without a tail-log backup.</p>



<p>The tail-log backup bridges this gap, capturing those final crucial log records and allowing you to restore the database to the exact point of failure. Depending on the damage, you may need to add additional options for this backup to succeed. See <a href="https://learn.microsoft.com/en-us/sql/relational-databases/backup-restore/tail-log-backups-sql-server?view=sql-server-ver16#TailLogScenarios" target="_blank" rel="noreferrer noopener">here</a> for those options.</p>



<h3 class="wp-block-heading" id="h-scenario-2-supporting-a-smooth-database-migration">Scenario 2: Supporting a smooth database migration</h3>



<p>If you&#8217;re migrating a database to a new instance, tail-log backups can be beneficial as well. Let&#8217;s say you have a migration coming up from SQL Server 2016 to a new SQL Server 2022 instance. You&#8217;ve pre-staged the database on the new instance by restoring the latest full backup and have been applying transaction log backups periodically, without recovery, throughout the week leading up to migration day. On migration day, you need to ensure the following happens:</p>



<ol class="wp-block-list">
<li>All transactions are captured up to the point the maintenance window begins.</li>



<li>Ensure the database on the old instance can no longer accept new connections.  You don&#8217;t want new transactions landing here by accident.</li>



<li>Restore the final transaction log backup on the new instance and recover the database.</li>
</ol>



<p>A tail-log backup can help in this scenario as well. Once you perform a tail-log backup, the database you performed it on is placed into the &#8220;<strong>restoring</strong>&#8221; state, which doesn&#8217;t allow connections. This ensures no future transactions can be written, just in case a connection string wasn&#8217;t updated somewhere during the migration. (I&#8217;d personally rather have the connection fail than be able to connect to a database that is no longer in use.) Once the migration is complete, including testing and validation, the old database can be dropped.</p>
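<p>You can confirm the state change with a quick query against sys.databases (a sketch; substitute your own database name):</p>



<pre class="wp-block-code"><code>-- After the tail-log backup, the source database should report RESTORING
SELECT name, state_desc
FROM sys.databases
WHERE name = 'ExampleDB';</code></pre>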



<p>Let&#8217;s take a look at the requirements needed before you can utilize tail-log backups.</p>



<h2 class="wp-block-heading" id="h-requirements-for-tail-log-backups">Requirements for tail-log backups</h2>



<p>To perform a tail-log backup, the following conditions must be met:</p>



<h3 class="wp-block-heading" id="h-recovery-model">Recovery model</h3>



<p>The database must be in the <strong>FULL</strong> or <strong>BULK_LOGGED </strong>recovery model. Tail-log backups are not possible (or needed) in the <strong>SIMPLE </strong>recovery model, as the transaction log is automatically truncated.</p>



<h3 class="wp-block-heading" id="h-prior-full-backup">Prior full backup</h3>



<p>At least one full database backup must have been taken previously.  </p>



<h3 class="wp-block-heading" id="h-log-file-accessibility">Log file accessibility</h3>



<p>The transaction log file (.ldf) must be accessible and largely intact, even if the data files are damaged or the database is offline.</p>



<p>You can determine if your database meets the first two requirements by using a query similar to the one below.</p>



<pre class="wp-block-code"><code>
USE msdb;
SELECT DISTINCT
	backupset.&#91;database_name],
	s.recovery_model_desc AS RecoveryModel,
	backupmediafamily.logical_device_name AS LogicalDeviceName,
	backupmediafamily.physical_device_name AS PhysicalDeviceName,
	backupset.expiration_date AS ExpirationDate,
	backupset.name AS Name,
	backupset.&#91;description] AS &#91;Description],
	backupset.user_name AS UserName,
	backupset.backup_start_date AS StartDate,
	backupset.backup_finish_date AS EndDate,
	DATEDIFF(mi, backupset.backup_start_date, backupset.backup_finish_date) AS DurationInMinutes,
	CAST(CASE backupset.type
	WHEN 'D' THEN 'Database'
	WHEN 'L' THEN 'Log'
	WHEN 'I' THEN 'Differential'
	WHEN 'F' THEN 'File'
	WHEN 'G' THEN 'Diff File'
	WHEN 'P' THEN 'Partial'
	WHEN 'Q' THEN 'Diff Partial'
	END AS NVARCHAR(128)) AS BackupType,
	ISNULL(backupset.compressed_backup_size, backupset.backup_size) / 1048576 AS SizeMB,
	is_snapshot,
	is_copy_only,
	GETDATE() AS DateChecked
FROM msdb.dbo.backupmediafamily AS backupmediafamily
INNER JOIN msdb.dbo.backupset AS backupset ON backupmediafamily.media_set_id = backupset.media_set_id
INNER JOIN master.sys.databases AS s ON backupset.database_name = s.name
WHERE CONVERT(datetime, backupset.backup_start_date, 102) &gt;= GETDATE() - 1
AND backupset.server_name = @@SERVERNAME  --Filters out databases that were restored from other instances.
AND backupset.&#91;database_name] = 'YourDatabaseName'
AND backupset.type = 'D'  --Full database backups only.
ORDER BY StartDate;
</code></pre>



<h2 class="wp-block-heading" id="h-how-to-perform-a-tail-log-backup-high-level">How to perform a tail-log backup (high-level)</h2>



<p>The command is a variation of the standard <strong>BACKUP LOG</strong> statement. The key difference often lies in the options used, particularly <strong>WITH NORECOVERY </strong>or <strong>WITH NO_TRUNCATE</strong>.</p>



<h3 class="wp-block-heading" id="h-scenario-1-database-damaged-not-starting-log-file-intact">Scenario 1: Database damaged/not starting (log file intact)</h3>



<p>In this scenario, you may have lost a drive containing your data files (.mdf).  You&#8217;re lucky because you&#8217;ve been following the old adage of keeping your data files and log files on separate disks so the log file is available.  </p>



<p>If the database data files are damaged or missing, the database cannot start normally.  You can attempt a tail-log backup using <strong>WITH NO_TRUNCATE</strong>.  This tells SQL Server to back up the log records without trying to access the data files or truncate the inactive portion of the log, which might fail since the database is damaged.</p>



<pre class="wp-block-code"><code>BACKUP LOG ExampleDB
TO DISK = '\\YourBackupShare\Backup\ExampleDB_TailLog_NoTruncate.trn'
WITH NO_TRUNCATE;</code></pre>



<h3 class="wp-block-heading" id="h-scenario-2-planned-migration">Scenario 2: Planned migration</h3>



<p>In scenario 2, you&#8217;re migrating the database to a new instance and must ensure all transactions are captured.  I&#8217;ll typically switch the database into <strong>SINGLE_USER</strong> mode and kill all other connections when doing so.</p>



<pre class="wp-block-code"><code>ALTER DATABASE ExampleDB SET SINGLE_USER WITH ROLLBACK IMMEDIATE</code></pre>



<p>Perform this step once your migration window has started and any applications, scheduled tasks, jobs, etc. have been stopped.</p>



<p>Perform the tail-log backup.</p>



<pre class="wp-block-code"><code>BACKUP LOG ExampleDB
TO DISK = '\\YourBackupShare\Backup\ExampleDB_TailLog.trn'
WITH NORECOVERY;</code></pre>



<p>Here&#8217;s what happens when you run this backup.</p>



<ol class="wp-block-list">
<li>The final log records are backed up to ExampleDB_TailLog.trn.</li>



<li>The ExampleDB database is put into the RESTORING state.</li>



<li>No further transactions can occur in the original ExampleDB database.</li>



<li>You can now proceed to restore any subsequent differential/log backups that you haven&#8217;t restored yet and, finally, this tail-log backup (using WITH RECOVERY) on the target server.</li>



<li>Once restored and recovered, don&#8217;t forget to place ExampleDB into MULTI_USER mode on the target server.</li>
</ol>
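<p>On the target server, the final steps from the list above might look like this sketch, reusing the same example names:</p>



<pre class="wp-block-code"><code>-- Restore the tail-log backup and bring the database online
RESTORE LOG ExampleDB
FROM DISK = '\\YourBackupShare\Backup\ExampleDB_TailLog.trn'
WITH RECOVERY;

-- Allow normal connections again
ALTER DATABASE ExampleDB SET MULTI_USER;</code></pre>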



<h2 class="wp-block-heading" id="h-tail-log-backup-process-checklist">Tail-log backup process checklist</h2>



<p>Here&#8217;s a quick checklist for performing a planned tail-log backup (like for a migration):</p>



<ul class="wp-block-list">
<li><strong>Verify Recovery Model: </strong>Ensure the database is in FULL or BULK_LOGGED mode.</li>



<li><strong>Check Backup History: </strong>Confirm a recent full backup exists.  Regular log backups should be running.</li>



<li><strong>Notify Users: </strong>Inform users of the planned downtime. Work with all other teams that depend on the database to determine a sufficient maintenance window for the migration. They&#8217;ll need to update connection strings, scheduled jobs, SSIS packages, etc., to point to the new database location.</li>



<li><strong>Restrict Access: </strong>Prevent new connections/transactions just before the backup.  </li>



<li>Execute <strong>BACKUP LOG &#8230; WITH NORECOVERY: </strong>Run the command, specifying a clear path and filename.</li>



<li><strong>Verify Backup File</strong>: Ensure the .trn file was created successfully.</li>



<li><strong>Confirm Database State: </strong>Check that the source database is now in the <strong>RESTORING </strong>state.</li>



<li><strong>Proceed with Restore: </strong>Use the tail-log backup as the final restore step on the target server or for recovery.</li>



<li><strong>Set database to MULTI_USER: </strong>If you&#8217;ve placed the source database in <strong>SINGLE_USER</strong> mode just prior to performing the tail-log backup, the restored database on the target will be in <strong>SINGLE_USER</strong> mode as well.  To allow connections, be sure to switch it to <strong>MULTI_USER</strong>.</li>
</ul>



<h2 class="wp-block-heading" id="h-in-conclusion">In conclusion</h2>



<p>While often overlooked in basic backup discussions, the tail-log backup is a vital tool in the SQL Server DBA&#8217;s toolkit.  It provides the critical ability to capture the very last transactions before a database restore or migration, minimizing data loss and ensuring the most up-to-date recovery possible.  Understanding when and how to use it is key to robust data protection and seamless database migrations.</p>



<h2 class="wp-block-heading" id="h-want-to-work-with-the-sero-group">Want to work with The SERO Group?</h2>



<p>Want to learn more about how The SERO Group helps organizations take the guesswork out of managing their SQL Servers? <a href="https://theserogroup.com/contact-us/" target="_blank" rel="noreferrer noopener">Schedule a no-obligation discovery call</a>&nbsp;with us to get started.</p>
<p>The post <a href="https://theserogroup.com/sql-server/why-tail-log-backups-matter-for-sql-server-recovery-and-migration/">Why Tail-Log Backups Matter for SQL Server Recovery and Migration</a> appeared first on <a href="https://theserogroup.com">The SERO Group</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">7302</post-id>	</item>
		<item>
		<title>The Costs of Undermanaged SQL Servers for Financial Institutions</title>
		<link>https://theserogroup.com/sql-server/the-costs-of-undermanaged-sql-servers-for-financial-institutions/</link>
		
		<dc:creator><![CDATA[Joe Webb]]></dc:creator>
		<pubDate>Wed, 09 Apr 2025 12:00:53 +0000</pubDate>
				<category><![CDATA[SQL Server]]></category>
		<category><![CDATA[Clusters]]></category>
		<category><![CDATA[Database]]></category>
		<category><![CDATA[Database Administration]]></category>
		<category><![CDATA[Database Development]]></category>
		<category><![CDATA[IT Manager]]></category>
		<category><![CDATA[Microsoft Azure]]></category>
		<category><![CDATA[Public Speaking]]></category>
		<category><![CDATA[Script Library]]></category>
		<category><![CDATA[Sero]]></category>
		<category><![CDATA[Sero Group]]></category>
		<category><![CDATA[Serogroup]]></category>
		<category><![CDATA[SQL]]></category>
		<category><![CDATA[SQL Assessment]]></category>
		<category><![CDATA[SQL Audit]]></category>
		<category><![CDATA[SQL Security]]></category>
		<category><![CDATA[SQL Server Management]]></category>
		<guid isPermaLink="false">https://theserogroup.com/?p=7299</guid>

					<description><![CDATA[<p>Banks and credit unions rely on SQL Server databases to power transactions, portals, reporting, fraud detection, and core systems. Despite this, many institutions end up undermanaging or even overlooking these critical systems. The result? Performance lags, security vulnerabilities, and unplanned downtime that can cost far more than most institutions realize. If you’re responsible for operational&#8230; <br /> <a class="read-more" href="https://theserogroup.com/sql-server/the-costs-of-undermanaged-sql-servers-for-financial-institutions/">Read more</a></p>
<p>The post <a href="https://theserogroup.com/sql-server/the-costs-of-undermanaged-sql-servers-for-financial-institutions/">The Costs of Undermanaged SQL Servers for Financial Institutions</a> appeared first on <a href="https://theserogroup.com">The SERO Group</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Banks and credit unions rely on SQL Server databases to power transactions, portals, reporting, fraud detection, and core systems. Despite this, many institutions end up undermanaging or even overlooking these critical systems. The result? Performance lags, security vulnerabilities, and unplanned downtime that can cost far more than most institutions realize. </p>



<p>If you’re responsible for operational stability, data integrity, or risk, it’s essential to understand the impact of undermanaged SQL Servers. Read on to discover the biggest issues—and how to prevent them <em>before </em>they cause problems. We&#8217;ll also show you how to build a strong business case for proactive SQL Server management.</p>



<h2 class="wp-block-heading" id="h-three-biggest-liabilities-of-undermanaged-sql-servers">Three Biggest Liabilities of Undermanaged SQL Servers</h2>



<h3 class="wp-block-heading" id="h-1-downtime-is-costly-and-more-common-than-you-may-think"><strong>1. Downtime Is Costly—and More Common Than You May Think</strong></h3>



<p>Let’s start with the hard numbers. In the financial services industry, the average cost of IT downtime is estimated at <a href="https://agio.com/cost-of-downtime-for-investment-management-leaders/" target="_blank" rel="noreferrer noopener">$9,000 per minute for larger institutions</a>. While community banks and credit unions may not hit that number, even smaller outages can disrupt transaction processing, customer support, and access to critical data—leading to lost revenue and reputational damage.</p>



<p><a href="https://www.infosecurity-magazine.com/news/destructive-attacks-banks-surge-13/" target="_blank" rel="noreferrer noopener">Infosecurity Magazine recently reported on a Contrast Security study</a> finding that over half (54%) of global financial institutions experienced cyberattacks in the past year in which adversaries destroyed data. </p>



<p>More tellingly, a <a href="https://datacenter.uptimeinstitute.com/rs/711-RIA-145/images/AnnualOutageAnalysis2023.03092023.pdf" target="_blank" rel="noreferrer noopener">2023 Uptime Institute report</a> found that over one-third of data center outages across all industries stemmed from system and software issues—many of which are database-related. These are not rare events. They’re happening every day in organizations that don’t have a dedicated plan for monitoring and managing their SQL Servers.</p>



<h3 class="wp-block-heading" id="h-2-security-threats-are-rising-and-databases-are-a-target"><strong>2. Security Threats Are Rising—and Databases Are a Target</strong></h3>



<p>As financial institutions increase their digital footprint, SQL Servers become even more attractive to cybercriminals. In 2024, the average cost of a data breach in the financial sector rose to $6.08 million, <a href="https://bankingjournal.aba.com/2024/08/report-average-data-breach-cost-for-financial-sector-tops-6m/" target="_blank" rel="noreferrer noopener">according to industry research reported in the ABA Banking Journal</a>. That’s a 22% premium over the global average, reflecting the high value of financial data and the regulatory scrutiny that follows a breach.</p>



<p>Unpatched SQL Server instances, misconfigured access controls, and lack of encryption are all common vulnerabilities in unmanaged environments. Bad actors know this, and they exploit it.</p>



<p>Without regular audits, patching schedules, and proactive security monitoring, your institution could be one missed update away from its next major incident.</p>
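<p>A simple starting point is knowing exactly what build each instance is running. The following query is a minimal sketch; the baseline build you compare against depends on Microsoft&#8217;s current servicing schedule, and ProductUpdateLevel returns NULL on older builds:</p>

<pre class="wp-block-code"><code>--current build, servicing level, and cumulative update of this instance
SELECT SERVERPROPERTY('ProductVersion')     AS &#91;Build]
      ,SERVERPROPERTY('ProductLevel')       AS &#91;Servicing Level]
      ,SERVERPROPERTY('ProductUpdateLevel') AS &#91;Cumulative Update];</code></pre>

<p>Comparing these values against Microsoft&#8217;s published build list quickly reveals instances that have fallen behind on patching.</p>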



<h3 class="wp-block-heading" id="h-3-performance-issues-impact-productivity-and-customer-experience"><strong>3. Performance Issues Impact Productivity and Customer Experience</strong></h3>



<p>An unmanaged SQL Server environment doesn’t just create security risks; it can slow down your business. Query bottlenecks, deadlocks, resource contention, and stale indexing strategies can cripple performance over time.</p>



<p>For your internal teams, this means longer wait times for reports and slower access to operational systems. For customers, it can mean delays in processing payments, loan applications, or online transactions. </p>



<p>Each delay has the potential to damage your reputation with your customers and to frustrate your team. Aren&#8217;t you tired of hearing, “I’m sorry, my computer is just slow today,” when you’re trying to get something done over the phone?</p>



<h2 class="wp-block-heading" id="h-there-s-real-roi-in-proactive-sql-server-management">There’s Real ROI in Proactive SQL Server Management</h2>



<p>The risks and the costs are clear. But that&#8217;s not the end of the story. </p>



<p>The upside is just as compelling. Here are some key benefits of a properly managed SQL Server estate.</p>



<h3 class="wp-block-heading" id="h-enhanced-performance">Enhanced Performance</h3>



<ul class="wp-block-list">
<li><strong>Optimized Queries:</strong>&nbsp;Properly tuned SQL queries execute faster, leading to quicker application response times and improved user experience.&nbsp;</li>



<li><strong>Efficient Resource Utilization:</strong>&nbsp;Monitoring and managing resources like CPU, memory, and disk I/O prevent bottlenecks and ensure optimal performance.&nbsp;</li>



<li><strong>Database Optimization:</strong>&nbsp;Indexing, partitioning, and other optimization techniques improve data access speed and reduce query execution time.&nbsp;</li>
</ul>



<h3 class="wp-block-heading" id="h-reduced-costs">Reduced Costs:</h3>



<ul class="wp-block-list">
<li><strong>Resource Optimization:</strong>&nbsp;By identifying and addressing performance issues, you can optimize resource utilization and potentially reduce hardware costs.&nbsp;</li>



<li><strong>Lower Downtime:</strong>&nbsp;Proactive maintenance and monitoring minimize downtime, reducing business disruption and associated costs.&nbsp;</li>



<li><strong>Improved Security:</strong>&nbsp;Strong security measures prevent data breaches and compliance issues, which can result in significant financial penalties.&nbsp;</li>
</ul>



<h3 class="wp-block-heading" id="h-improved-data-management">Improved Data Management</h3>



<ul class="wp-block-list">
<li><strong>Data Integrity:</strong>&nbsp;Proper backup and recovery procedures ensure data integrity and prevent data loss.&nbsp;</li>



<li><strong>Compliance:</strong>&nbsp;Meeting regulatory requirements and industry standards reduces the risk of penalties and legal issues.&nbsp;</li>



<li><strong>Data-Driven Decisions:</strong>&nbsp;Access to accurate and timely data enables better decision-making and improved business outcomes.&nbsp;</li>
</ul>



<h2 class="wp-block-heading" id="h-what-proactive-sql-server-management-looks-like"><strong>What Proactive SQL Server Management Looks Like</strong></h2>



<p>Proactive management isn’t just about reacting to alerts—it’s about preventing problems before they start. This includes:</p>



<ul class="wp-block-list">
<li>Proactive Daily Health Checks </li>



<li>Regularly reviewing SQL Server and Windows log files</li>



<li>Real-time monitoring and alerting</li>



<li>Regular performance tuning and index optimization</li>



<li>Patch management</li>



<li>Vulnerability scanning</li>



<li>Secure access controls and encryption enforcement</li>



<li>Backup validation and disaster recovery planning</li>



<li>Monthly or quarterly health checks and reporting</li>



<li>Annual <a href="https://www.cisecurity.org/" target="_blank" rel="noreferrer noopener">Center for Internet Security (CIS)</a> Benchmark Assessments</li>
</ul>
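<p>To make one of these items concrete, a daily backup check can be as simple as querying msdb for each database&#8217;s most recent full backup. This is a minimal sketch; your alerting threshold may differ from 24 hours:</p>

<pre class="wp-block-code"><code>--databases with no full backup, or whose last full backup is over 24 hours old
SELECT d.name AS &#91;Database]
      ,MAX(b.backup_finish_date) AS &#91;Last Full Backup]
FROM sys.databases d
LEFT JOIN msdb.dbo.backupset b
       ON b.database_name = d.name
      AND b.type = 'D' --'D' = full database backup
WHERE d.name &lt;&gt; 'tempdb'
GROUP BY d.name
HAVING MAX(b.backup_finish_date) IS NULL
    OR MAX(b.backup_finish_date) &lt; DATEADD(HOUR, -24, GETDATE());</code></pre>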



<p>Whether you manage your servers in-house or partner with a specialized team like The SERO Group, having a defined strategy can reduce your operational risk while maximizing the value of your technology investments.</p>



<h2 class="wp-block-heading" id="h-the-bottom-line-proactive-sql-server-management-is-a-sound-investment"><strong>The Bottom Line: Proactive SQL Server Management is a Sound Investment</strong></h2>



<p>Your SQL Servers are too important to be treated as set-it-and-forget-it infrastructure. As regulatory pressures grow and customer expectations rise, your institution needs systems that are healthy, secure, and reliable.</p>



<p>By investing in professional SQL Server management, financial institutions can reduce downtime, strengthen cybersecurity, and improve performance—while freeing internal teams to focus on their primary duties.</p>



<p>Don’t wait for a breach or a breakdown to take action. Make your SQL Server estate&#8217;s health and resilience a strategic priority. </p>



<p>Want to learn more about how The SERO Group helps financial institutions keep their SQL Servers healthy, secure, and reliable? <a href="https://theserogroup.com/contact-us/" target="_blank" rel="noreferrer noopener">Schedule a no-obligation discovery call</a>. </p>



<p>Learn more about our: </p>



<ul class="wp-block-list">
<li><a href="https://theserogroup.com/sql-server-cis-benchmarks-assessment/">SQL Server CIS® Benchmarks™ Assessment</a></li>



<li><a href="https://theserogroup.com/sql-health-check/">SQL Server Health Check</a></li>
</ul>
<p>The post <a href="https://theserogroup.com/sql-server/the-costs-of-undermanaged-sql-servers-for-financial-institutions/">The Costs of Undermanaged SQL Servers for Financial Institutions</a> appeared first on <a href="https://theserogroup.com">The SERO Group</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">7299</post-id>	</item>
		<item>
		<title>How to Encrypt Sensitive Text in SQL Server with ENCRYPTBYPASSPHRASE</title>
		<link>https://theserogroup.com/data-security/how-to-encrypt-sensitive-text-in-sql-server-with-encryptbypassphrase/</link>
		
		<dc:creator><![CDATA[Eric Cobb]]></dc:creator>
		<pubDate>Wed, 23 Oct 2024 12:00:00 +0000</pubDate>
				<category><![CDATA[Data Security]]></category>
		<category><![CDATA[Data Strategy]]></category>
		<category><![CDATA[DBA]]></category>
		<category><![CDATA[SQL Server]]></category>
		<category><![CDATA[Database]]></category>
		<category><![CDATA[Database Administration]]></category>
		<category><![CDATA[Database Development]]></category>
		<category><![CDATA[Script Library]]></category>
		<category><![CDATA[SQL]]></category>
		<category><![CDATA[SQL Security]]></category>
		<guid isPermaLink="false">https://theserogroup.com/?p=6786</guid>

					<description><![CDATA[<p>Storing sensitive information in a database, like passwords or social security numbers, is common practice. However, storing them securely is less common. Unfortunately, one of the most typical approaches is to store sensitive information in a table as clear text. That means that anyone with access to that table can see all of that sensitive&#8230; <br /> <a class="read-more" href="https://theserogroup.com/data-security/how-to-encrypt-sensitive-text-in-sql-server-with-encryptbypassphrase/">Read more</a></p>
<p>The post <a href="https://theserogroup.com/data-security/how-to-encrypt-sensitive-text-in-sql-server-with-encryptbypassphrase/">How to Encrypt Sensitive Text in SQL Server with ENCRYPTBYPASSPHRASE</a> appeared first on <a href="https://theserogroup.com">The SERO Group</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Storing sensitive information in a database, like passwords or social security numbers, is common practice. However, storing them securely is less common. Unfortunately, one of the most typical approaches is to store sensitive information in a table as clear text. That means that anyone with access to that table can see all of that sensitive data.</p>



<p>Just to be clear, storing sensitive information as a clear text string is a really, really,&nbsp;<em>really</em>&nbsp;bad idea.</p>



<p>Not encrypting information in a database can cause serious problems. As just one example, if the database is compromised, all user passwords could be exposed. Data breaches are becoming increasingly common. If the authorities come knocking on your door, you need to be able to show them that you at least made a concerted effort to protect that data.</p>



<h2 class="wp-block-heading" id="h-encrypting-text-that-will-need-to-be-decrypted">Encrypting text that will need to be decrypted</h2>



<p>In some cases, you may be able to store your sensitive data as strongly encrypted text that will never need to be decrypted. For example, hashing a password used for your application login and then just comparing the hashed password for the login instead of the actual password. But, in most cases, being able to decrypt the sensitive data is going to be necessary.</p>
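<p>For that first scenario, SQL Server&#8217;s <a href="https://docs.microsoft.com/en-us/sql/t-sql/functions/hashbytes-transact-sql" target="_blank" rel="noreferrer noopener">HASHBYTES</a> function covers the one-way case. This is a simplified sketch; production password storage should also use a per-user salt:</p>

<pre class="wp-block-code"><code>--one-way hash: store and compare the hash, never the clear-text password
SELECT HASHBYTES('SHA2_256', N'ABC123') AS &#91;Password Hash];</code></pre>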



<p>In these cases, <a href="https://docs.microsoft.com/en-us/sql/t-sql/functions/encryptbypassphrase-transact-sql" target="_blank" rel="noreferrer noopener">ENCRYPTBYPASSPHRASE</a> (available in SQL Server 2008 and up) offers one of the simplest ways for you to encrypt sensitive information in a way that can also be decrypted (by using <a href="https://docs.microsoft.com/en-us/sql/t-sql/functions/decryptbypassphrase-transact-sql" target="_blank" rel="noreferrer noopener">DECRYPTBYPASSPHRASE</a>). At its most basic, ENCRYPTBYPASSPHRASE requires two mandatory arguments: a passphrase used to generate the encryption key and the text to be encrypted. Notice that it specifies a pass<strong><em>phrase</em></strong>, not pass<strong><em>word</em></strong>. There is an important difference between these two.</p>



<h2 class="wp-block-heading" id="h-a-passphrase-vs-a-password">A passphrase vs. a password</h2>



<p>As described in the ENCRYPTBYPASSPHRASE documentation:&nbsp;</p>



<blockquote class="wp-block-quote is-layout-flow wp-block-quote-is-layout-flow">
<p><em>A passphrase is a password that includes spaces. The advantage of using a passphrase is that it is easier to remember a meaningful phrase or sentence than to remember a comparably long string of characters.</em></p>
</blockquote>



<p>Many people don’t realize that you can use a space as a legitimate special character in most passwords. By doing this, you can generate a much more secure password sentence (or phrase) instead of a single word. An example of a passphrase may be something like “I forgot my password!”</p>



<p>Just to be clear, a space is <strong><em>not </em></strong>required in your passphrase for ENCRYPTBYPASSPHRASE. If you wanted to use a GUID for your passphrase or a random string such as “Zgt9$Ex%*unZO8Z},” that is perfectly acceptable.</p>



<h2 class="wp-block-heading" id="h-using-nbsp-encryptbypassphrase">Using&nbsp;ENCRYPTBYPASSPHRASE</h2>



<p>For the examples in this post, I am going to use the encryption passphrase “This is my Passphrase!”, and the text to be encrypted is “ABC123”.</p>



<p>The basic syntax is:<br>ENCRYPTBYPASSPHRASE(‘encryption passphrase’, ‘text to encrypt’)</p>



<p>There are other arguments that can be used with&nbsp;ENCRYPTBYPASSPHRASE (see&nbsp;<a href="https://docs.microsoft.com/en-us/sql/t-sql/functions/encryptbypassphrase-transact-sql" target="_blank" rel="noreferrer noopener">MSDN Doc</a>), but for this simple example we are just using the&nbsp;two mandatory arguments.</p>



<p>To view the encrypted value of the text “ABC123”, you would use this script:</p>



<pre class="wp-block-code"><code>SELECT ENCRYPTBYPASSPHRASE(N'This is my Passphrase!', N'ABC123');</code></pre>



<p>That SELECT statement will return a&nbsp;VARBINARY value such as:&nbsp;<em>0x0100000093EEC20B790EF208B1FB631F0AB3028E3A8C196643C4BD578528A0DFAE7AB45B</em></p>



<p>It is important to note that the VARBINARY value returned from ENCRYPTBYPASSPHRASE is <a href="https://en.wikipedia.org/wiki/Nondeterministic_algorithm" target="_blank" rel="noreferrer noopener">nondeterministic</a>, meaning that even with the same input it will not generate the same output every time.  So you can run the exact same SELECT statement multiple times and get a different result each time.</p>



<p>Thankfully, this output has no bearing on using the DECRYPTBYPASSPHRASE function. As long as you have the correct passphrase, DECRYPTBYPASSPHRASE will successfully decrypt any of those VARBINARY results to their original value.</p>
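<p>You can see this round trip in a single batch. Each run produces a different VARBINARY value, but the decrypted result is always the same:</p>

<pre class="wp-block-code"><code>DECLARE @Encrypted VARBINARY(8000) =
    ENCRYPTBYPASSPHRASE(N'This is my Passphrase!', N'ABC123');

--decrypting with the same passphrase returns the original text: ABC123
SELECT CONVERT(NVARCHAR(50),
    DECRYPTBYPASSPHRASE(N'This is my Passphrase!', @Encrypted)) AS &#91;Decrypted];</code></pre>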



<h2 class="wp-block-heading" id="h-storing-an-encrypted-value-in-a-table">Storing an encrypted value in a table</h2>



<p>Now that we know how to encrypt a sensitive text string, let’s take a look at how to store that encrypted value in a table. Since the value returned from ENCRYPTBYPASSPHRASE is a VARBINARY data type, that is how we want to store it; it is also the data type required by DECRYPTBYPASSPHRASE.</p>



<p>The first thing we need to do is determine the size of our encrypted column in our table. The VARBINARY values returned by ENCRYPTBYPASSPHRASE can vary in size, with a maximum size of 8,000 bytes. The size of the returned value is going to depend on the size of the actual text being encrypted. You can use the <a href="https://docs.microsoft.com/en-us/sql/t-sql/functions/datalength-transact-sql" target="_blank" rel="noreferrer noopener">DATALENGTH</a> function to help figure that out. If you have a way to control the maximum allowed length of the sensitive text value you want to encrypt, use that size for your table column, but try not to use VARBINARY(8000) if you don’t have to.</p>
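<p>For example, to see how large the encrypted value of our sample text actually is:</p>

<pre class="wp-block-code"><code>--returns the size, in bytes, of the encrypted VARBINARY value
SELECT DATALENGTH(ENCRYPTBYPASSPHRASE(N'This is my Passphrase!', N'ABC123')) AS &#91;Encrypted Size];</code></pre>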



<p>Here is a simple example of storing our encrypted text in the [Password] column of a table:</p>



<pre class="wp-block-code"><code>CREATE TABLE dbo.Users (&#91;UserName] VARCHAR(50), &#91;Password] VARBINARY(50))
 
INSERT INTO dbo.Users (&#91;UserName], &#91;Password])
VALUES ('Charlie Brown', ENCRYPTBYPASSPHRASE(N'This is my Passphrase!', N'ABC123'))
 
SELECT &#91;UserName], &#91;Password]
FROM dbo.Users</code></pre>



<h2 class="wp-block-heading" id="h-using-decryptbypassphrase">Using DECRYPTBYPASSPHRASE</h2>



<p>Now that we have our sensitive text encrypted, we need to be able to decrypt it as well. &nbsp;This is easily done by using the&nbsp;DECRYPTBYPASSPHRASE function with the same passphrase we encrypted our text string with. However, DECRYPTBYPASSPHRASE also returns a VARBINARY value, which we will have to convert to a string. &nbsp;This can be done by adding a CONVERT function to our SELECT statement.</p>



<pre class="wp-block-code"><code>SELECT &#91;UserName], CONVERT(NVARCHAR, DECRYPTBYPASSPHRASE(N'This is my Passphrase!', &#91;Password]))
FROM dbo.Users</code></pre>



<p>Now you should see your decrypted value returned correctly in clear text. ENCRYPTBYPASSPHRASE offers a quick and easy way for you to encrypt text in SQL Server and can be useful for encrypting sensitive information if you need to be able to decrypt it later. </p>



<h2 class="wp-block-heading" id="h-want-to-work-with-the-sero-group">Want to work with The SERO Group?</h2>



<p>Want to learn more about how The SERO Group helps organizations take the guesswork out of managing their SQL Servers? <a href="https://theserogroup.com/contact-us/" target="_blank" rel="noreferrer noopener">Schedule a no-obligation discovery call</a>&nbsp;with us to get started.</p>
<p>The post <a href="https://theserogroup.com/data-security/how-to-encrypt-sensitive-text-in-sql-server-with-encryptbypassphrase/">How to Encrypt Sensitive Text in SQL Server with ENCRYPTBYPASSPHRASE</a> appeared first on <a href="https://theserogroup.com">The SERO Group</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">6786</post-id>	</item>
		<item>
		<title>Prevent SQL Server Outages by Monitoring Transaction Log Growth</title>
		<link>https://theserogroup.com/sql-server/prevent-sql-server-outages-by-monitoring-transaction-log-growth/</link>
		
		<dc:creator><![CDATA[Luke Campbell]]></dc:creator>
		<pubDate>Wed, 09 Oct 2024 12:00:00 +0000</pubDate>
				<category><![CDATA[SQL Server]]></category>
		<category><![CDATA[Database]]></category>
		<category><![CDATA[Database Administration]]></category>
		<category><![CDATA[IT Manager]]></category>
		<category><![CDATA[Script Library]]></category>
		<category><![CDATA[SQL]]></category>
		<category><![CDATA[SQL Server Consultant]]></category>
		<category><![CDATA[SQL Server Management]]></category>
		<guid isPermaLink="false">https://theserogroup.com/?p=6725</guid>

					<description><![CDATA[<p>I&#8217;ve lost count of the times I&#8217;ve been called after hours due to a drive filling up. The usual culprit? Transaction log file growth. Monitoring the growth of your SQL Server transaction log files is crucial for maintaining database performance and ensuring system reliability. Unchecked transaction log growth can lead to disk space issues and&#8230; <br /> <a class="read-more" href="https://theserogroup.com/sql-server/prevent-sql-server-outages-by-monitoring-transaction-log-growth/">Read more</a></p>
<p>The post <a href="https://theserogroup.com/sql-server/prevent-sql-server-outages-by-monitoring-transaction-log-growth/">Prevent SQL Server Outages by Monitoring Transaction Log Growth</a> appeared first on <a href="https://theserogroup.com">The SERO Group</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>I&#8217;ve lost count of the times I&#8217;ve been called after hours due to a drive filling up. The usual culprit? Transaction log file growth.</p>



<p>Monitoring the growth of your SQL Server transaction log files is crucial for maintaining database performance and ensuring system reliability.  Unchecked transaction log growth can lead to disk space issues and system outages.  Let&#8217;s take a look at some of the causes of transaction log file growth, why it&#8217;s important to monitor, and how you can use the sp_whoisactive procedure to keep an eye on it.</p>



<h2 class="wp-block-heading" id="h-what-causes-the-transaction-log-file-to-grow">What Causes the Transaction Log File to Grow</h2>



<p>The transaction log file records all transactions and database modifications made by each transaction.  Several factors can contribute to its growth:</p>



<ul class="wp-block-list">
<li><strong>Long-Running Transactions: </strong>Transactions that take a long time to complete prevent the log from truncating because SQL Server needs to maintain the log records until the transaction is committed.</li>



<li><strong>Recovery Models: </strong>
<ul class="wp-block-list">
<li><strong>Full Recovery Model: </strong>In this model, the transaction log will not truncate until a log backup occurs.  If log backups are not taken regularly, the log file will grow indefinitely.</li>



<li><strong>Bulk-Logged Recovery Model: </strong>Similar to the full recovery model, it is used for bulk operations. Log growth can still be an issue if backups aren&#8217;t scheduled appropriately.</li>



<li><strong>Simple Recovery Model: </strong>Although SQL Server automatically truncates the log in this recovery model, you must still watch for transactions that are making large changes all at once. For instance, if you update millions of rows in one transaction, this could be enough to cause the log file to grow.</li>
</ul>
</li>



<li><strong>Lack of Log Backups: </strong>Without regular log backups in the full or bulk-logged recovery models, the log cannot truncate, leading to continuous growth.</li>



<li><strong>High Transaction Volume: </strong>A sudden spike in transaction activity can cause the log file to grow rapidly to accommodate the increased volume.</li>



<li><strong>Replication and Availability Groups: </strong>Features like availability groups and replication can delay log truncation until the data is sent to the replicas or replicated databases. Seeing the log file continue to grow even after taking a log backup? Check the <strong>log_reuse_wait_desc</strong> column in <strong>sys.databases</strong>. It&#8217;ll indicate why.</li>



<li><strong>Open Transactions</strong>: Uncommitted transactions hold on to log space because SQL Server needs the log records to roll back if necessary. Who hasn&#8217;t started their session with &#8220;begin transaction&#8221; and gone home for the weekend (forgetting the commit)?</li>
</ul>
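<p>Checking why a log can&#8217;t truncate is a one-line query against sys.databases:</p>

<pre class="wp-block-code"><code>--shows what, if anything, is preventing log truncation for each database
SELECT name, recovery_model_desc, log_reuse_wait_desc
FROM sys.databases;</code></pre>

<p>A value of NOTHING means the log can truncate normally; values like LOG_BACKUP, ACTIVE_TRANSACTION, or AVAILABILITY_REPLICA point you directly at the cause.</p>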



<h2 class="wp-block-heading" id="h-why-it-matters">Why it Matters</h2>



<p>The impact can be massive, from crashing applications to filling up disk space and affecting all other databases on the same drive.</p>



<ul class="wp-block-list">
<li><strong>Disk Space Consumption: </strong>Uncontrolled log growth can consume all available disk space, leading to application failures and downtime.</li>



<li><strong>Performance Degradation: </strong>Large log files can slow down database operations, including backups and recovery processes, affecting overall system performance.</li>



<li><strong>Recovery Time Objectives (RTOs): </strong>In disaster recovery scenarios, large transaction logs can increase the time it takes to restore databases, impacting business continuity.</li>



<li><strong>Maintenance Challenges: </strong>Managing and maintaining oversized log files can be cumbersome, requiring more time for routine operations like backups.</li>



<li><strong>Risk</strong> <strong>of Data Loss: </strong>If the disk runs out of space due to log growth, new transactions cannot be logged, leading to potential data loss or corruption. </li>
</ul>
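<p>A quick way to gauge your current exposure is to check log file size and usage across all databases on the instance:</p>

<pre class="wp-block-code"><code>--log size (MB) and percentage of log space in use for every database
DBCC SQLPERF(LOGSPACE);</code></pre>

<p>A log that is both large and nearly full is a warning sign worth investigating before the drive fills up.</p>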



<h2 class="wp-block-heading" id="h-how-to-monitor-it-using-sp-whoisactive">How to monitor it using sp_WhoIsActive</h2>



<p>sp_whoisactive is a powerful stored procedure developed by Adam Machanic that provides a comprehensive view of the current activity on your SQL Server.  It can be used to monitor transaction log growth.</p>



<p>Here&#8217;s how you can use it:</p>



<h3 class="wp-block-heading" id="h-installation">Installation</h3>



<p>First, download the script from <a href="https://github.com/amachanic/sp_whoisactive">https://github.com/amachanic/sp_whoisactive</a> and run it on your instance. I usually place it in a utility database named DBA rather than in master.</p>



<h3 class="wp-block-heading" id="h-monitoring-active-sessions"><strong>Monitoring Active Sessions</strong></h3>



<p>Run the procedure to see all active sessions:</p>



<pre class="wp-block-code"><code>USE DBA
GO
EXEC sp_whoisactive</code></pre>



<h3 class="wp-block-heading" id="h-identifying-long-running-transactions"><strong>Identifying Long-Running Transactions</strong></h3>



<p>To focus on transactions that might be causing log growth, look for sessions with high values in the <em>tran_log_writes</em> column, which shows the number of log records written by the session. You&#8217;ll need to set @get_transaction_info to 1 to include transaction details.</p>



<pre class="wp-block-code"><code>EXEC sp_whoisactive @get_transaction_info = 1</code></pre>



<p>This will add columns related to transaction duration and log usage.</p>



<h3 class="wp-block-heading" id="h-ordering-by-log-writes"><strong>Ordering by log writes</strong></h3>



<p>If you&#8217;re interested in sessions consuming the most log space, order the results accordingly:</p>



<pre class="wp-block-code"><code>EXEC sp_whoisactive @get_transaction_info = 1, @sort_order = '&#91;tran_log_writes] DESC'</code></pre>



<h3 class="wp-block-heading" id="h-automating-monitoring"><strong>Automating Monitoring</strong></h3>



<p>Schedule sp_whoisactive to run at regular intervals and log the output to a table for historical analysis, or to send an alert if certain thresholds are met. This can help you identify patterns over time (e.g., Mike usually starts a new session every Friday evening and leaves for the weekend). The following code will create a table, <em>dbo.WhoIsActiveLog</em>, with the appropriate schema. The second statement will then execute sp_whoisactive and store the results in this table.</p>



<pre class="wp-block-code"><code>--create the dbo.WhoIsActiveLog table
DECLARE @schemaDefinition VARCHAR(MAX)
EXEC sp_WhoIsActive
    @find_block_leaders = 1,
    @get_transaction_info = 1,
    @get_locks = 1,
    @return_schema = 1,
    @schema = @schemaDefinition OUTPUT

SELECT @schemaDefinition = REPLACE(@schemaDefinition, '&lt;table_name&gt;', 'dbo.WhoIsActiveLog')
EXEC (@schemaDefinition)
GO

--capture current activity into the table
EXEC sp_WhoIsActive
    @find_block_leaders = 1,
    @get_transaction_info = 1,
    @get_locks = 1,
    @destination_table = 'dbo.WhoIsActiveLog'</code></pre>



<p>Once you identify sessions causing excessive log growth, you can investigate the underlying queries or processes:</p>



<ul class="wp-block-list">
<li><strong>Optimize Queries: </strong>Rewrite inefficient queries to reduce transaction time.</li>



<li><strong>Adjust Recovery Models: </strong>If appropriate, and point-in-time recovery isn&#8217;t required, consider switching to the simple recovery model. However, this won&#8217;t help if it&#8217;s a long-running transaction making many changes simultaneously.</li>



<li><strong>Schedule Regular Backups: </strong>Ensure that log backups are taken frequently to prevent log file bloat.</li>



<li><strong>Kill Problematic Sessions: </strong>As a last resort, terminate sessions that are causing issues, but be cautious of potential data loss.</li>
</ul>
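<p>For the backup item, a log backup is a single statement. Here is a sketch with a hypothetical database name and backup path:</p>

<pre class="wp-block-code"><code>--back up the transaction log so the inactive portion can be truncated and reused
BACKUP LOG &#91;YourDatabase]
TO DISK = N'D:\Backups\YourDatabase_Log.trn';</code></pre>

<p>In the full and bulk-logged recovery models, scheduling this to run every few minutes is what keeps the log file from growing without bound.</p>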



<h2 class="wp-block-heading" id="h-want-to-work-with-the-sero-group">Want to work with The SERO Group?</h2>



<p>Want to learn more about how The SERO Group helps organizations take the guesswork out of managing their SQL Servers? <a href="https://theserogroup.com/contact-us/" target="_blank" rel="noreferrer noopener">Schedule a no-obligation discovery call</a>&nbsp;with us to get started.</p>
<p>The post <a href="https://theserogroup.com/sql-server/prevent-sql-server-outages-by-monitoring-transaction-log-growth/">Prevent SQL Server Outages by Monitoring Transaction Log Growth</a> appeared first on <a href="https://theserogroup.com">The SERO Group</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">6725</post-id>	</item>
		<item>
		<title>Useful Scripts For SQL Server Logins and Permissions</title>
		<link>https://theserogroup.com/dba/useful-scripts-for-sql-server-logins-and-permissions/</link>
		
		<dc:creator><![CDATA[Eric Cobb]]></dc:creator>
		<pubDate>Wed, 25 Sep 2024 12:00:00 +0000</pubDate>
				<category><![CDATA[DBA]]></category>
		<category><![CDATA[Database]]></category>
		<category><![CDATA[Database Administration]]></category>
		<category><![CDATA[Database Development]]></category>
		<category><![CDATA[Script Library]]></category>
		<category><![CDATA[SQL Security]]></category>
		<category><![CDATA[SQL Server]]></category>
		<category><![CDATA[SQL Server Management]]></category>
		<guid isPermaLink="false">https://theserogroup.com/?p=6676</guid>

					<description><![CDATA[<p>Since security and permissions are a big part of a DBA’s job, it&#8217;s important to be able to find out things like who has elevated login permissions or when a login was last used. Here are a few queries to help you check your server and database access. Most of these scripts are based off&#8230; <br /> <a class="read-more" href="https://theserogroup.com/dba/useful-scripts-for-sql-server-logins-and-permissions/">Read more</a></p>
<p>The post <a href="https://theserogroup.com/dba/useful-scripts-for-sql-server-logins-and-permissions/">Useful Scripts For SQL Server Logins and Permissions</a> appeared first on <a href="https://theserogroup.com">The SERO Group</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Since security and permissions are a big part of a DBA’s job, it&#8217;s important to be able to find out things like who has elevated login permissions or when a login was last used. Here are a few queries to help you check your server and database access. Most of these scripts are based on sys.dm_exec_sessions, a DMV that shows information about all active user connections. </p>



<p>I should also mention that the results of these queries will <strong>only go back to the last SQL Server start time</strong>. Anything that occurred before then won&#8217;t be available.</p>



<h3 class="wp-block-heading" id="h-logins-vs-users">Logins Vs. Users</h3>



<p>First, let&#8217;s take a look at the difference between “logins” and “users” in SQL Server, as people sometimes get them confused, or think they&#8217;re the same thing. A “login” allows access to a SQL Server instance. A “user” allows access to a specific database on that instance. Usually, a user is tied to a login, although you can have a user that is not tied to a login (known as a loginless user).</p>
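<p>The relationship looks like this in T-SQL (the login, user, and database names here are hypothetical):</p>

<pre class="wp-block-code"><code>--a login grants access to the SQL Server instance
CREATE LOGIN &#91;AppLogin] WITH PASSWORD = N'Use a strong passphrase here!';

--a user, mapped to that login, grants access to a specific database
USE &#91;SalesDb];
CREATE USER &#91;AppLogin] FOR LOGIN &#91;AppLogin];</code></pre>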



<h3 class="wp-block-heading" id="h-when-was-the-last-time-a-login-was-used">When was the last time a login was used?</h3>



<pre class="wp-block-code"><code>--list of logins and last time each logged in
SELECT &#91;Login] = login_name 
	,&#91;Last Login Time] = MAX(login_time)
FROM sys.dm_exec_sessions
GROUP BY &#91;login_name];</code></pre>



<h3 class="wp-block-heading" id="h-which-logins-have-logged-in-within-the-last-x-hours">Which logins have logged in within the last X hours?</h3>



<pre class="wp-block-code"><code>--all logins in the last 4 hours
SELECT &#91;Login] = login_name
	,&#91;Last Login Time] = login_time
	,&#91;Host] = host_name
	,&#91;Program] = program_name
	,&#91;Client Interface] =  client_interface_name
	,&#91;Database] = DB_NAME(database_id)
FROM sys.dm_exec_sessions
WHERE &#91;login_time] &gt; DATEADD(HH,-4,getdate())--modify date as needed
ORDER BY &#91;login_time] desc</code></pre>



<h3 class="wp-block-heading" id="h-how-many-times-has-each-login-logged-in-within-the-last-x-hours">How many times has each login logged in within the last X hours?</h3>



<pre class="wp-block-code"><code>--login counts for the last 4 hours
SELECT &#91;Login] = login_name
	,&#91;Last Login Time] = MAX(login_time)
	,&#91;Number Of Logins] = COUNT(*)
FROM sys.dm_exec_sessions
WHERE &#91;login_time] &gt; DATEADD(HH,-4,getdate())--modify date as needed
GROUP BY &#91;login_name]
ORDER BY &#91;Login] desc</code></pre>



<h3 class="wp-block-heading" id="h-which-logins-have-sysadmin-access">Which logins have Sysadmin access?</h3>



<pre class="wp-block-code"><code>--check for logins with sysadmin access
SELECT &#91;Login] = name
	,&#91;Login Type] = type_desc
	,&#91;Disabled] = is_disabled
FROM     master.sys.server_principals 
WHERE    IS_SRVROLEMEMBER ('sysadmin',name) = 1
ORDER BY &#91;Login]</code></pre>
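

<p>Sysadmin isn&#8217;t the only elevated role worth auditing. A query along the same lines (a sketch using sys.server_role_members, which holds one row per role membership) lists the members of every server role:</p>



<pre class="wp-block-code"><code>--list members of all server roles, not just sysadmin
SELECT &#91;Role] = r.name
	,&#91;Login] = m.name
	,&#91;Login Type] = m.type_desc
	,&#91;Disabled] = m.is_disabled
FROM sys.server_role_members srm
	JOIN sys.server_principals r ON r.principal_id = srm.role_principal_id
	JOIN sys.server_principals m ON m.principal_id = srm.member_principal_id
ORDER BY &#91;Role], &#91;Login];</code></pre>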



<h3 class="wp-block-heading" id="h-checking-a-user-s-access-to-databases">Checking A User’s Access To Databases</h3>



<p>This query will return a list of databases that the specified user has access to. It works for SQL Server logins as well as Active Directory logins. This is done by specifying&nbsp;<a href="https://docs.microsoft.com/en-us/sql/t-sql/statements/execute-as-transact-sql" target="_blank" rel="noreferrer noopener">EXECUTE AS LOGIN</a>&nbsp;just before the query. Be sure to specify&nbsp;<a href="https://docs.microsoft.com/en-us/sql/t-sql/statements/revert-transact-sql" target="_blank" rel="noreferrer noopener">REVERT</a>&nbsp;after the query runs so your session returns to its original security context.</p>



<p>It should be noted that this query only shows you what databases the user can access, not the permissions the user has on the databases.</p>



<pre class="wp-block-code"><code>EXECUTE AS LOGIN = 'YourDomain\User.Name' --Change This
	SELECT &#91;name]
	FROM master.sys.databases
	WHERE HAS_DBACCESS(&#91;name]) = 1
REVERT</code></pre>
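

<p>If you want to see the permissions themselves, one option is to call sys.fn_my_permissions inside the same EXECUTE AS block after switching to the database in question. This is a sketch; the login and database names are placeholders:</p>



<pre class="wp-block-code"><code>EXECUTE AS LOGIN = 'YourDomain\User.Name' --Change This
	USE &#91;YourDatabase]; --Change This
	SELECT &#91;Permission] = permission_name
		,&#91;Scope] = entity_name
	FROM sys.fn_my_permissions(NULL, 'DATABASE');
REVERT</code></pre>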



<h2 class="wp-block-heading" id="h-want-to-work-with-the-sero-group">Want to work with The SERO Group?</h2>



<p>Want to learn more about how The SERO Group helps organizations take the guesswork out of managing their SQL Servers? <a href="https://theserogroup.com/contact-us/" target="_blank" rel="noreferrer noopener">Schedule a no-obligation discovery call</a>&nbsp;with us to get started.</p>
<p>The post <a href="https://theserogroup.com/dba/useful-scripts-for-sql-server-logins-and-permissions/">Useful Scripts For SQL Server Logins and Permissions</a> appeared first on <a href="https://theserogroup.com">The SERO Group</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">6676</post-id>	</item>
	</channel>
</rss>
