It’s important to monitor file growth. Due to database activity, your files accumulate records and this triggers growth. Disk space is not expensive nowadays; nevertheless, you need to know when you are close to the limits of your disk.
For this, you can use sys.master_files and/or sys.database_files (the latter is scoped to the current database).
You need some additional objects:
- a table, holding the size of each file per day – I call it dbo.FileSize
- a stored procedure, to insert a row into this utility table – I call it dbo.fileSize_add
- a job, running once a day, that calls the stored procedure – this does the actual monitoring
- a script that draws the conclusions, e.g. what will the size be in 30 days?
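The daily job can be created with SQL Server Agent. A minimal sketch, assuming the utility objects live in db_workspace and the job should run at 07:00 (the job and schedule names are placeholders of my own choosing):

```sql
USE msdb;
GO
EXEC dbo.sp_add_job
     @job_name = N'Monitor file size';
EXEC dbo.sp_add_jobstep
     @job_name      = N'Monitor file size'
    ,@step_name     = N'Call dbo.fileSize_add'
    ,@subsystem     = N'TSQL'
    ,@database_name = N'db_workspace' /* assumption: the database holding dbo.FileSize */
    ,@command       = N'DECLARE @d DATE = CAST(SYSDATETIME() AS DATE);
EXEC dbo.fileSize_add @i_InsertedDate = @d;';
EXEC dbo.sp_add_schedule
     @schedule_name     = N'Daily at 07:00'
    ,@freq_type         = 4   /* daily */
    ,@freq_interval     = 1
    ,@active_start_time = 070000;
EXEC dbo.sp_attach_schedule
     @job_name      = N'Monitor file size'
    ,@schedule_name = N'Daily at 07:00';
EXEC dbo.sp_add_jobserver
     @job_name = N'Monitor file size';
GO
```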
DROP TABLE IF EXISTS dbo.FileSize;
GO
CREATE TABLE dbo.FileSize
(
DBName VARCHAR(128) NOT NULL
, FileLogicalName VARCHAR(128) NOT NULL
, FilePhysicalName VARCHAR(512) NOT NULL
, Size_MB DECIMAL(15,2) NOT NULL
, InsertedDate DATE NOT NULL
, CONSTRAINT pk_FileSize__DBName_FileLogicalName_InsertedDate
PRIMARY KEY CLUSTERED (DBName, FileLogicalName, InsertedDate)
);
GO
CREATE OR ALTER PROCEDURE dbo.fileSize_add
(
@i_InsertedDate DATE
)
AS
BEGIN
INSERT INTO dbo.FileSize( DBName, FileLogicalName, FilePhysicalName, Size_MB, InsertedDate)
SELECT
DB_NAME(mf.database_id) as dbName
,mf.name as fileLogicalName
,mf.physical_name as physicalName
,(mf.size * 8 )/1024.0 as fileSize_MB /* why * 8 ? because 8kb is for a page */
,@i_InsertedDate as InsertedDate
FROM sys.master_files as mf
WHERE
mf.database_id IN ( DB_ID('db_workspace'), DB_ID('WideWorldImporters') );
END
GO
In a later version, you can also monitor the size of your volumes:
SELECT DISTINCT
vs.volume_mount_point as volumeName
,CONVERT(INT, vs.available_bytes / 1048576.0) as freeSpace_MB
,CONVERT(INT, vs.total_bytes / 1048576.0) as TotalSize_MB
FROM sys.master_files as mf
CROSS APPLY sys.dm_os_volume_stats(mf.database_id,mf.file_id) as vs
ORDER BY volumeName
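You can combine this with a threshold check, e.g. list only the volumes with less than 10% free space (the 10% figure is just an example; pick whatever margin fits your environment):

```sql
SELECT DISTINCT
     vs.volume_mount_point AS volumeName
    ,CONVERT(INT, vs.available_bytes / 1048576.0) AS freeSpace_MB
    ,CONVERT(INT, vs.total_bytes / 1048576.0)     AS totalSize_MB
FROM sys.master_files AS mf
CROSS APPLY sys.dm_os_volume_stats(mf.database_id, mf.file_id) AS vs
WHERE vs.available_bytes < 0.10 * vs.total_bytes; /* less than 10% free */
```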
Next, let’s start the monitoring by calling the stored procedure, then see what is in the target table:
EXEC dbo.FileSize_add @i_InsertedDate = '20180816';
SELECT * FROM dbo.FileSize;
DBName FileLogicalName FilePhysicalName Size_MB InsertedDate
db_workspace db_workspace D:\SQL\Databases\DATA\db_workspace.mdf 8.00 2018-08-16
db_workspace db_workspace_log D:\SQL\Databases\DATA\db_workspace_log.ldf 24.00 2018-08-16
WideWorldImporters WWI_InMemory_Data_1 D:\SQL\Databases\DATA\WWI\WideWorldImporters_InMemory_Data_1 0.00 2018-08-16
WideWorldImporters WWI_Log D:\SQL\Databases\DATA\WWI\WideWorldImporters.ldf 100.00 2018-08-16
WideWorldImporters WWI_Primary D:\SQL\Databases\DATA\WWI\WideWorldImporters.mdf 1024.00 2018-08-16
WideWorldImporters WWI_UserData D:\SQL\Databases\DATA\WWI\WideWorldImporters_UserData.ndf 2048.00 2018-08-16
Step 2: simulate some database activity:
USE db_workspace;
GO
DROP TABLE IF EXISTS dbo.Test;
CREATE TABLE dbo.Test
( idTest INT IDENTITY(1,1) NOT NULL
 ,cTest CHAR(2000) NOT NULL
);
INSERT INTO dbo.Test(cTest)
SELECT TOP (100000) s.name
FROM sys.all_columns AS c
CROSS JOIN sys.all_columns AS s;
/*(100000 rows affected)*/
Step 3: the next day, monitor the size again:
EXEC dbo.FileSize_add @i_InsertedDate = '20180817';
SELECT DBName, FileLogicalName, Size_MB, InsertedDate FROM dbo.FileSize WHERE InsertedDate = '20180817';
DbName FileLogicalName Size_MB InsertedDate
db_workspace db_workspace 200.00 2018-08-17
db_workspace db_workspace_log 344.00 2018-08-17
WideWorldImporters WWI_InMemory_Data_1 0.00 2018-08-17
WideWorldImporters WWI_Log 100.00 2018-08-17
WideWorldImporters WWI_Primary 1024.00 2018-08-17
WideWorldImporters WWI_UserData 2048.00 2018-08-17
Step 4: more activity:
USE db_workspace;
GO
DROP TABLE IF EXISTS dbo.Test2;
CREATE TABLE dbo.Test2
( idTest INT IDENTITY(1,1) NOT NULL
 ,cTest CHAR(2000) NOT NULL
);
INSERT INTO dbo.Test2(cTest)
SELECT TOP (1000) s.name
FROM sys.all_columns AS c
CROSS JOIN sys.all_columns AS s;
/*(1000 rows affected)*/
Step 5: the next day, monitor the size again:
EXEC dbo.FileSize_add @i_InsertedDate = '20180818';
SELECT DBName, FileLogicalName, Size_MB, InsertedDate FROM dbo.FileSize WHERE InsertedDate = '20180818';
DbName FileLogicalName Size_MB InsertedDate
db_workspace db_workspace 264.00 2018-08-18
db_workspace db_workspace_log 344.00 2018-08-18
WideWorldImporters WWI_InMemory_Data_1 0.00 2018-08-18
WideWorldImporters WWI_Log 100.00 2018-08-18
WideWorldImporters WWI_Primary 1024.00 2018-08-18
WideWorldImporters WWI_UserData 2048.00 2018-08-18
Step 6: conclusions:
- for the data files (as opposed to the log files), we need to calculate the growth from one day to the next.
- based on this, we calculate the average growth, the minimum growth and the maximum growth.
- then we project into the future: say, what will the size be in 30 days, based on the average or the maximum growth?
- the log files (ldf) depend on the recovery model, because they are truncated by checkpoints or log backups. For them, the interesting number is the maximum size reached in a day.
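For the log files, a minimal sketch of that idea – the largest size recorded per log file, reusing the same "physical name ends in ldf" convention as the forecast query below uses for data files:

```sql
SELECT DBName
    ,FileLogicalName
    ,MAX(Size_MB) AS maxSize_MB /* the largest log size ever recorded */
FROM dbo.FileSize
WHERE RIGHT(FilePhysicalName, 3) = 'ldf'
GROUP BY DBName, FileLogicalName;
```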
;WITH Src AS
(
SELECT FileLogicalName
,Size_MB as CurrentSize
,LAG(Size_MB,1,0) OVER(PARTITION BY DbName,FileLogicalName ORDER BY InsertedDate) AS PrevSize
,LAG(InsertedDate,1,NULL) OVER(PARTITION BY DbName,FileLogicalName ORDER BY InsertedDate) AS PrevDate
,Size_MB - LAG(Size_MB,1,0) OVER(PARTITION BY DbName,FileLogicalName ORDER BY InsertedDate) AS DeltaSize
,DATEDIFF(DAY,LAG(InsertedDate,1,NULL) OVER(PARTITION BY DbName,FileLogicalName ORDER BY InsertedDate),InsertedDate) AS DeltaDays
,DBName
,InsertedDate
,FIRST_VALUE(Size_MB) OVER(PARTITION BY DbName,FileLogicalName ORDER BY InsertedDate DESC) AS LastValue_Size
,FIRST_VALUE(InsertedDate) OVER(PARTITION BY DbName,FileLogicalName ORDER BY InsertedDate DESC) AS LastValue_Date
FROM dbo.FileSize as FS
WHERE
RIGHT(FilePhysicalName,3) <> 'ldf'
AND FileLogicalName = 'db_workspace'
)
SELECT FileLogicalName
,MAX(LastValue_Size) AS CurrentSize
,MAX(LastValue_Date) AS LastReportedDate
,CAST(SYSDATETIME() AS DATE) AS CurrentDate
,AVG(DeltaSize/NULLIF(DATEDIFF(DAY,PrevDate,InsertedDate),0)) AS AVG_DeltaSize
,MIN(DeltaSize/NULLIF(DATEDIFF(DAY,PrevDate,InsertedDate),0)) AS minDeltaPerDay
,MAX(DeltaSize/NULLIF(DATEDIFF(DAY,PrevDate,InsertedDate),0)) AS maxDeltaPerDay
,MAX(LastValue_Size) + 30* MAX(DeltaSize/NULLIF(DATEDIFF(DAY,PrevDate,InsertedDate),0)) AS Size30Days_Max
,MAX(LastValue_Size) + 30* AVG(DeltaSize/NULLIF(DATEDIFF(DAY,PrevDate,InsertedDate),0)) AS Size30Days_Avg
FROM Src
GROUP BY
FileLogicalName
FileLogicalName CurrentSize LastReportedDate CurrentDate AVG_DeltaSize minDeltaPerDay maxDeltaPerDay Size30Days_Max Size30Days_Avg
db_workspace 264.00 2018-08-18 2018-08-18 128.00 64.00 192.00 6024.00 4104.00
So, in 30 days, assuming the maximum daily growth, the size will be 6024.00 MB; assuming the average growth, it will be 4104.00 MB.
And now you can compare this with the available space on the volume … and trigger a response, such as an alert email.
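A minimal sketch of such a response, assuming Database Mail is configured; the profile name, recipient address and volume are placeholders, and the 6024 MB threshold is the projected 30-day size from the query above:

```sql
DECLARE @free_MB INT =
    (SELECT MIN(CONVERT(INT, vs.available_bytes / 1048576.0))
     FROM sys.master_files AS mf
     CROSS APPLY sys.dm_os_volume_stats(mf.database_id, mf.file_id) AS vs
     WHERE vs.volume_mount_point = 'D:\'); /* placeholder volume */

IF @free_MB < 6024 /* projected 30-day size (max growth scenario) */
    EXEC msdb.dbo.sp_send_dbmail
         @profile_name = 'dba_profile'     /* placeholder */
        ,@recipients   = 'dba@example.com' /* placeholder */
        ,@subject      = 'Volume D:\ may run out of space within 30 days'
        ,@body         = 'Projected data file growth exceeds the free space on D:\.';
```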
Thanks,
S
#t-sql #fileGrow #sqlserver #mssqlserver #sql #databaseGrow