SSIS Script Task error: Object does not contain a definition for 'Success' - RESOLVED

Hi experts, I'm posting here because the SSIS forum doesn't get much activity.

I have a really simple need to read a file's date. If the file exists, I want to store its last modified date in a variable.

This is my C# script.

#region Namespaces
using System;
using System.Data;
using Microsoft.SqlServer.Dts.Runtime;
using System.Windows.Forms;
#endregion

namespace ST_85710691ef3245f294643629ecd93767
{
    /// <summary>
    /// ScriptMain is the entry point class of the script. Do not change the name, attributes,
    /// or parent of this class.
    /// </summary>
    [Microsoft.SqlServer.Dts.Tasks.ScriptTask.SSISScriptTaskEntryPointAttribute]
    public partial class ScriptMain : Microsoft.SqlServer.Dts.Tasks.ScriptTask.VSTARTScriptObjectModelBase
    {
        #region Help:  Using Integration Services variables and parameters in a script
        /* To use a variable in this script, first ensure that the variable has been added to
         * either the list contained in the ReadOnlyVariables property or the list contained in
         * the ReadWriteVariables property of this script task, according to whether or not your
         * code needs to write to the variable. To add the variable, save this script, close this instance of
         * Visual Studio, and update the ReadOnlyVariables and
         * ReadWriteVariables properties in the Script Transformation Editor window.
         * To use a parameter in this script, follow the same steps. Parameters are always read-only.
         * Example of reading from a variable:
         *  DateTime startTime = (DateTime) Dts.Variables["System::StartTime"].Value;
         * Example of writing to a variable:
         *  Dts.Variables["User::myStringVariable"].Value = "new value";
         * Example of reading from a package parameter:
         *  int batchId = (int) Dts.Variables["$Package::batchId"].Value;
         * Example of reading from a project parameter:
         *  int batchId = (int) Dts.Variables["$Project::batchId"].Value;
         * Example of reading from a sensitive project parameter:
         *  int batchId = (int) Dts.Variables["$Project::batchId"].GetSensitiveValue();
         * */
        #endregion

        #region Help:  Firing Integration Services events from a script
        /* This script task can fire events for logging purposes.
         * Example of firing an error event:
         *  Dts.Events.FireError(18, "Process Values", "Bad value", "", 0);
         * Example of firing an information event:
         *  Dts.Events.FireInformation(3, "Process Values", "Processing has started", "", 0, ref fireAgain);
         * Example of firing a warning event:
         *  Dts.Events.FireWarning(14, "Process Values", "No values received for input", "", 0);
         * */
        #endregion

        #region Help:  Using Integration Services connection managers in a script
        /* Some types of connection managers can be used in this script task.  See the topic
         * "Working with Connection Managers Programmatically" for details.
         * Example of using an ADO.Net connection manager:
         *  object rawConnection = Dts.Connections["Sales DB"].AcquireConnection(Dts.Transaction);
         *  SqlConnection myADONETConnection = (SqlConnection)rawConnection;
         *  //Use the connection in some code here, then release the connection
         *  Dts.Connections["Sales DB"].ReleaseConnection(rawConnection);
         * Example of using a File connection manager
         *  object rawConnection = Dts.Connections[""].AcquireConnection(Dts.Transaction);
         *  string filePath = (string)rawConnection;
         *  //Use the connection in some code here, then release the connection
         *  Dts.Connections[""].ReleaseConnection(rawConnection);
         * */
        #endregion

        /// <summary>
        /// This method is called when this script task executes in the control flow.
        /// Before returning from this method, set the value of Dts.TaskResult to indicate success or failure.
        /// To open Help, press F1.
        /// </summary>
        public void Main()
        {
            System.IO.FileInfo theFile =
                new System.IO.FileInfo(Dts.Variables["User::FullPath"].Value.ToString());

            if (theFile.Exists)
                Dts.Variables["User::FileModifiedDate"].Value = theFile.LastWriteTime;

            object ScriptResults = null;
            Dts.TaskResult = (int)ScriptResults.Success;
        }
    }
}


This line: Dts.TaskResult = (int)ScriptResults.Success;
shows the error: Object does not contain a definition for 'Success'

Any ideas? Thanks

Can you provide a little more background as to why this information is needed in your SSIS package? Is it driving additional workflow?

To be honest, and I hope this is genuinely helpful: this is a very complex way to get basic file information. You can do it trivially with a PowerShell one-liner:

powershell.exe -Command "(Get-Item myFile.txt).CreationTime"

Passing the file name in from a DTS variable might be tricky, but I would question the need for SSIS at all unless you have other things to do in a complicated workflow. If I had to do it that way, I'd find another way to get the file info and dump it into a table, then use a Data Flow to query it and feed a script if needed.

Thanks for your interest in my problem, @robert_volk
The objective is to loop through all files in all folders, get the file path (this is working well) and the last modified date for each file, and insert those columns into a table that another process can analyze later to determine which files are older than X days.
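For reference, the recursive scan described above can be sketched in plain C# outside of SSIS (the class and method names here are illustrative, not from the actual package):

```csharp
using System;
using System.Collections.Generic;
using System.IO;

static class FileAgeScan
{
    // Return (path, last-modified) for every file under rootDir,
    // recursing into all subfolders - the same pairs the package
    // inserts into its staging table.
    public static IEnumerable<(string Path, DateTime Modified)> Scan(string rootDir)
    {
        foreach (string path in Directory.EnumerateFiles(rootDir, "*", SearchOption.AllDirectories))
            yield return (path, File.GetLastWriteTime(path));
    }
}
```

Directory.EnumerateFiles with SearchOption.AllDirectories streams results lazily, so it handles large folder trees without building the whole list in memory first.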
I believe SSIS can handle this task adequately. As I'm way down the road with SSIS (and my PowerShell skills are quite limited), I'd prefer not to change horses in midstream, as they say.

Thanks again.


In case you are interested in another method for getting this data:

DROP TABLE IF EXISTS #f;
CREATE TABLE #f(fname nvarchar(512) NULL
, file_name AS substring(fname, 1, charindex('*', fname) - 1)
, last_modified AS try_cast(substring(fname, charindex('*', fname) + 1, 99) AS datetime));

INSERT #f EXEC xp_cmdshell 'for /R "C:\Windows\System32\drivers" %a in (*.sys) do @echo %~fa*%~ta';

DELETE #f WHERE fname LIKE 'ERROR:%' OR fname IS NULL;  -- removes rows that would break the name/date parsing
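The computed columns simply split each echoed line at the `*` marker the echo inserts between the path and the timestamp. For illustration, the same parse could be sketched in C# (the class name is hypothetical):

```csharp
using System;

static class CmdOutputParser
{
    // Split a "fullpath*timestamp" line (as produced by the echo in the
    // for /R command) into its file name and last-modified parts,
    // mirroring the #f table's computed columns. Lines without a '*'
    // (e.g. error output) yield a null date, like try_cast does.
    public static (string FileName, DateTime? Modified) Parse(string line)
    {
        int star = line.IndexOf('*');
        if (star < 0) return (line, null);

        DateTime parsed;
        bool ok = DateTime.TryParse(line.Substring(star + 1), out parsed);
        return (line.Substring(0, star), ok ? parsed : (DateTime?)null);
    }
}
```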


If you are always looking for files older than a certain number of days (e.g. 100 days or older), you can substitute this for the other INSERT...xp_cmdshell command:

INSERT #f EXEC xp_cmdshell 'forfiles /P "C:\Windows\System32\drivers" /S /M "*.sys" /D -100 /C "cmd /c echo @path*@fdate @ftime"'

The for and forfiles commands are a little cryptic, but they are built into every Windows system and can be called via SQL Agent, xp_cmdshell, or even PowerShell.

You can parameterize this by constructing a command variable and sticking the path (C:\Windows\System32\drivers) and file specification (*.sys) into the correct places.


I think you should remove

object ScriptResults = null;

from your script.
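For context: the script task template normally generates a ScriptResults enum at the bottom of ScriptMain (the same enum shown in a later reply). Declaring a local variable named ScriptResults of type object hides that enum, which is why the compiler says object has no 'Success' member. A minimal sketch of the enum, with values mirroring DTSExecResult:

```csharp
// Sketch of the template-generated enum; the template assigns these from
// Microsoft.SqlServer.Dts.Runtime.DTSExecResult, where Success = 0 and Failure = 1.
enum ScriptResults
{
    Success = 0,
    Failure = 1
}

// With the shadowing variable removed, the original cast compiles:
//   Dts.TaskResult = (int)ScriptResults.Success;
```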

I am not sure how you are getting the list of files, but I would recommend doing that in a single pass using a script task.

I have a script task I use instead of the standard Foreach Loop container. This task gets a directory's contents, loops over the files found, and builds a recordset object. There are probably parts of this that you don't need, but it should give you an idea:

First part - Namespaces:

#region Namespaces
using System;
using System.IO;
using System.Data;
using System.Linq;
using System.Management;
using Microsoft.SqlServer.Dts.Runtime;
using System.Windows.Forms;
#endregion

Here we are including System.Linq so we can sort the results from the directory. Main code:

            IFormatProvider provider = null;

            // Get the root directory
            string rootDirectory = Dts.Variables["$Project::RootDirectory"].Value.ToString();

            // Create our dataset and datatable
            DataSet dsSorted = new DataSet();
            DataTable fileListTable = dsSorted.Tables.Add();
            fileListTable.Columns.Add("FileName", typeof(string));
            fileListTable.Columns.Add("FileDate", typeof(DateTime));

            // Get our file pattern search criteria
            string filePattern = Dts.Variables["$Package::FilePattern"].Value.ToString();

            // Get the files from the root directory that match our file pattern
            DirectoryInfo info = new DirectoryInfo(rootDirectory);
            FileInfo[] files = info.GetFiles(filePattern).OrderBy(p => p.LastWriteTime).ToArray();

            // Loop through each file found
            foreach (FileInfo file in files)
                // Add file information to the data table
                fileListTable.Rows.Add(file.Name, file.LastWriteTime);

            Dts.Variables["SortedFileList"].Value = dsSorted;
            Dts.TaskResult = (int)ScriptResults.Success;

And the enum:

        enum ScriptResults
        {
            Success = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Success,
            Failure = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Failure
        }

The end result is an object named SortedFileList, which can then be used in a Foreach Loop configured to loop over the result set contained in the object.

The Foreach Loop is configured with variables for each of the items returned in the object, and those variables can then be used as needed.

Now, if the goal is to find the files older than a specific age, you can easily use this same code to compare the last write time to a date being passed in, or calculate the cutoff yourself (for example, older than 30 days) and archive/delete/rename the files as needed.
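As a sketch, that age comparison could be factored like this (the class, method, and the idea of passing the reference time in are illustrative, not part of the script above):

```csharp
using System;
using System.IO;
using System.Linq;

static class StaleFileFinder
{
    // Return the files whose LastWriteTime is older than maxAgeDays,
    // measured against the supplied reference time (DateTime.Now in a package).
    public static FileInfo[] OlderThan(FileInfo[] files, int maxAgeDays, DateTime now)
    {
        DateTime cutoff = now.AddDays(-maxAgeDays);
        return files.Where(f => f.LastWriteTime < cutoff).ToArray();
    }
}
```

Passing the reference time as a parameter rather than reading DateTime.Now inside the method keeps the cutoff consistent across a long-running loop and makes the logic easy to test.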

At this point, there is no need to put them into a table to determine the age of the files - that can all be done in this script. And if the goal is to process each file and then archive it, you can use the information in the object and pass it to a script task inside the Foreach Loop, or even a File System Task.


I forgot to post back that I was able to get the package working to retrieve the date each file was last modified as I loop through the file names.

The main missing component was Imports System.IO in the Imports section of the script task.
And this is the script code:
If (fileInfo.Exists) Then
    ' Get last modified date
    Dts.Variables("User::FileModifiedDate").Value = fileInfo.LastWriteTime
End If

Thanks @robert_volk @RogierPronk and @jeffw8713