Can a stored procedure call an SSIS package and pass parameters?

    Question

  • Greetings-

    I need to create an SSIS package that would be called from a stored procedure.  I would need to pass the SSIS package several parameters.  Is this possible?  I'm using SQL Server 2008.

    Thanks--

    Friday, January 23, 2009 1:49 AM

Answers

All replies

  • Hi BlackCatBone,

    You can consider the following approaches and the variants available:

    1. CLR stored procedure
      a. Make use of the SSIS object model to programmatically start the package using Package.Execute(...)
      b. Create an external process that calls dtexec with the arguments you wish to pass
    2. T-SQL stored procedure
      a. Create a SQL Agent job, which uses the SSIS subsystem. In the stored procedure, invoke SQL Agent sp_start_job
      b. Use xp_cmdshell to call dtExec

    For 2(b), check out this example:
    http://www.simple-talk.com/sql/sql-server-2005/executing-ssis-packages-/

    It requires the use of xp_cmdshell, which is disabled by default.
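
    For illustration, here is a minimal T-SQL sketch of 2(b): the stored procedure builds a dtexec command string and runs it through xp_cmdshell (which must be enabled first). The package path, server name, and variable names below are hypothetical placeholders; /SET is used to override package variable values at run time.

    -- Minimal sketch of 2(b); names are placeholders.
    CREATE PROCEDURE dbo.usp_RunImportPackage
        @CustomerID INT,
        @LoadDate   DATE
    AS
    BEGIN
        DECLARE @cmd NVARCHAR(4000);

        -- /SET overrides a package variable's value at run time.
        SET @cmd = N'dtexec /SQL "\ImportCustomer" /SERVER "MYSERVER" '
                 + N'/SET \Package.Variables[User::CustomerID].Properties[Value];'
                 + CAST(@CustomerID AS NVARCHAR(10)) + N' '
                 + N'/SET \Package.Variables[User::LoadDate].Properties[Value];'
                 + CONVERT(NVARCHAR(10), @LoadDate, 120);

        EXEC master.dbo.xp_cmdshell @cmd;
    END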


    If you are writing a CLR stored procedure, you can make use of the SSIS object model to start packages.
    Check out http://blogs.msdn.com/michen/archive/2007/03/22/running-ssis-package-programmatically.aspx , where Michael provides a good discussion of how to run SSIS packages programmatically.

     Thank you.

    Friday, January 23, 2009 2:38 AM
    Answerer
  • Thanks for the good information.  One quick question about your point 2(a) describing sp_start_job.  I did take a look at that procedure and I don't believe it allows parameter values to be passed.  Am I right, or did I miss something?  This approach would be ideal if I could pass parameter values.  I'm not sure the xp_cmdshell approach would be acceptable in my shop, although dtexec appears to provide all the capabilities I need.

    I appreciate the thorough overview!

    -BCB

     

    Friday, January 23, 2009 1:42 PM
  • essentially, the stored proc would need to build the command string for dtexec.  however, i advise against using xp_cmdshell to call dtexec because xp_cmdshell poses a security risk and should only be used if absolutely necessary.  as a workaround, you can use a sql server agent job to call dtexec via the cmdexec subsystem.

    hth


    Duane Douglas, MCAD | http://www.ssisbi.com | Please mark the post(s) that answered your question.
    Saturday, January 24, 2009 7:24 AM
    Moderator
  • Hi,

    You could expose your parameters via a Package Configuration configured to use SQL Server.  This way you could update your parameters and then execute your job via sp_start_job (without needing to provide parameters) or use dtexec.

    Cheers
    Jeff


    SQL Server MVP / WARDY IT Solutions, Solutions Architect
    Saturday, January 24, 2009 8:02 AM
  • BlackCatBone said:

    Greetings-

    I need to create an SSIS package that would be called from a stored procedure.  I would need to pass the SSIS package several parameters.  Is this possible?  I'm using SQL Server 2008.

    Thanks--



    Take a look at this video, maybe it will help.

    http://www.jumpstarttv.com/media.aspx?vid=500&

     

    Saturday, January 24, 2009 4:57 PM
  • Jeff Wharton said:

    Hi,

    You could expose your parameters via a Package Configuration configured to use SQL Server.  This way you could update your parameters and then execute your job via sp_start_job (without needing to provide parameters) or use dtexec.

    Cheers
    Jeff


    SQL Server MVP / WARDY IT Solutions, Solutions Architect

    can you please tell me how this would work?  specifically, how would the configuration be updated?  the solution doesn't seem very dynamic to me.

     

    thanks in advance



    Duane Douglas, MCAD | http://www.ssisbi.com | Please mark the post(s) that answered your question.
    Monday, January 26, 2009 4:52 AM
    Moderator
  • Hi,

    2 above states:

    T-SQL stored procedure
    a. Create a SQL Agent job, which uses the SSIS subsystem. In the stored procedure, invoke SQL Agent sp_start_job
    b. Use xp_cmdshell to call dtExec

    Instead of passing the parameters to dtexec, you would use the parameters to update the values in the configuration table and then execute sp_start_job which in turn executes the package.  The package would then automatically read the new configuration values from the configuration table. This way you don't need to use dtexec (requiring the creation of a command string) and you wouldn't need to use xp_cmdshell.
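
    As a minimal sketch (assuming the package reads a SQL Server package configuration from the default [SSIS Configurations] table and is wrapped in a SQL Agent job; the filter, variable, and job names below are hypothetical), the stored procedure could look something like this:

    CREATE PROCEDURE dbo.usp_RunImportPackage
        @CustomerID INT
    AS
    BEGIN
        -- 1. Write the new parameter value into the configuration table.
        UPDATE dbo.[SSIS Configurations]
        SET    ConfiguredValue = CAST(@CustomerID AS NVARCHAR(255))
        WHERE  ConfigurationFilter = N'ImportCustomer'
          AND  PackagePath = N'\Package.Variables[User::CustomerID].Properties[Value]';

        -- 2. Start the SQL Agent job; the package picks up the new value
        --    from the configuration table when it executes.
        EXEC msdb.dbo.sp_start_job @job_name = N'Run ImportCustomer';
    END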

    Cheers
    Jeff


    SQL Server MVP / WARDY IT Solutions, Solutions Architect
    Monday, January 26, 2009 5:29 AM
  • Jeff Wharton said:

    Hi,

    2 above states:

    T-SQL stored procedure
    a. Create a SQL Agent job, which uses the SSIS subsystem. In the stored procedure, invoke SQL Agent sp_start_job
    b. Use xp_cmdshell to call dtExec

    Instead of passing the parameters to dtexec, you would use the parameters to update the values in the configuration table and then execute sp_start_job which in turn executes the package.  The package would then automatically read the new configuration values from the configuration table. This way you don't need to use dtexec (requiring the creation of a command string) and you wouldn't need to use xp_cmdshell.

    Cheers
    Jeff


    SQL Server MVP / WARDY IT Solutions, Solutions Architect

    ok.  i see that your solution works, but it doesn't seem very efficient because it requires a table update.  also, creating the command string is a rather trivial pursuit and dtexec can be called from sql server agent via the cmdexec subsystem -- xp_cmdshell isn't required.


    Duane Douglas, MCAD | http://www.ssisbi.com | Please mark the post(s) that answered your question.
    Monday, January 26, 2009 6:06 AM
    Moderator
  • Duane Douglas said:

    Jeff Wharton said:

    Hi,

    2 above states:

    T-SQL stored procedure
    a. Create a SQL Agent job, which uses the SSIS subsystem. In the stored procedure, invoke SQL Agent sp_start_job
    b. Use xp_cmdshell to call dtExec

    Instead of passing the parameters to dtexec, you would use the parameters to update the values in the configuration table and then execute sp_start_job which in turn executes the package.  The package would then automatically read the new configuration values from the configuration table. This way you don't need to use dtexec (requiring the creation of a command string) and you wouldn't need to use xp_cmdshell.

    Cheers
    Jeff


    SQL Server MVP / WARDY IT Solutions, Solutions Architect

    ok.  i see that your solution works, but it doesn't seem very efficient because it requires a table update.  also, creating the command string is a rather trivial pursuit and dtexec can be called from sql server agent via the cmdexec subsystem -- xp_cmdshell isn't required.


    Duane Douglas, MCAD | http://www.ssisbi.com | Please mark the post(s) that answered your question.



    So you are stating that you can pass parameters to sp_start_job that can then be passed onto dtexec?

    Cheers
    Jeff


    SQL Server MVP / WARDY IT Solutions, Solutions Architect
    Monday, January 26, 2009 6:10 AM
  • Jeff Wharton said:

    Duane Douglas said:

    Jeff Wharton said:

    Hi,

    2 above states:

    T-SQL stored procedure
    a. Create a SQL Agent job, which uses the SSIS subsystem. In the stored procedure, invoke SQL Agent sp_start_job
    b. Use xp_cmdshell to call dtExec

    Instead of passing the parameters to dtexec, you would use the parameters to update the values in the configuration table and then execute sp_start_job which in turn executes the package.  The package would then automatically read the new configuration values from the configuration table. This way you don't need to use dtexec (requiring the creation of a command string) and you wouldn't need to use xp_cmdshell.

    Cheers
    Jeff


    SQL Server MVP / WARDY IT Solutions, Solutions Architect

    ok.  i see that your solution works, but it doesn't seem very efficient because it requires a table update.  also, creating the command string is a rather trivial pursuit and dtexec can be called from sql server agent via the cmdexec subsystem -- xp_cmdshell isn't required.


    Duane Douglas, MCAD | http://www.ssisbi.com | Please mark the post(s) that answered your question.



    So you are stating that you can pass parameters to sp_start_job that can then be passed onto dtexec?

    Cheers
    Jeff


    SQL Server MVP / WARDY IT Solutions, Solutions Architect

    i believe there are two ways to approach the solution.  if i'm not mistaken, sp_update_jobstep would run first, once the command string is built, and then sp_start_job would run.  the other alternative is to dynamically create and start the job, and have it automatically tear itself down when execution completes.
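
    for illustration, a rough sketch of the first alternative, assuming a job whose step calls dtexec through the cmdexec subsystem (the job, package, and variable names are made up):

    DECLARE @CustomerID INT = 123;
    DECLARE @cmd NVARCHAR(4000);

    -- build the dtexec command string with the new parameter value.
    SET @cmd = N'dtexec /SQL "\ImportCustomer" /SERVER "MYSERVER" '
             + N'/SET \Package.Variables[User::CustomerID].Properties[Value];'
             + CAST(@CustomerID AS NVARCHAR(10));

    -- 1. rewrite the command of step 1 in the job.
    EXEC msdb.dbo.sp_update_jobstep
         @job_name = N'Run ImportCustomer',
         @step_id  = 1,
         @command  = @cmd;

    -- 2. run the job with the updated step.
    EXEC msdb.dbo.sp_start_job @job_name = N'Run ImportCustomer';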


    Duane Douglas, MCAD | http://www.ssisbi.com | Please mark the post(s) that answered your question.
    • Proposed as answer by alina11 Tuesday, January 27, 2009 12:40 PM
    Monday, January 26, 2009 6:45 AM
    Moderator
  • So please explain to me how updating a job (which updates a table under the covers) or dynamically creating and starting a job every time you wish to change values (which requires updating multiple tables under the covers and is a lot of effort) is more efficient than updating a single configuration table and running sp_start_job!

    Furthermore, sp_update_job must be run from the msdb database.  Users must also be in the sysadmin fixed server role or be granted one of the following SQL Server Agent fixed database roles in the msdb database:

    • SQLAgentUserRole
    • SQLAgentReaderRole
    • SQLAgentOperatorRole

    Also, only members of sysadmin can use this stored procedure to edit the attributes of jobs that are owned by other users.

    Cheers
    Jeff


    SQL Server MVP / WARDY IT Solutions, Solutions Architect
    Monday, January 26, 2009 6:58 AM
  • Jeff Wharton said:

    So please explain to me how updating a job (which updates a table under the covers) or dynamically creating and starting a job every time you wish to change values (which requires updating multiple tables under the covers and is a lot of effort) is more efficient than updating a single configuration table and running sp_start_job!

    Cheers
    Jeff


    SQL Server MVP / WARDY IT Solutions, Solutions Architect

    which tables are updated by sql server agent when a job is created or updated?


    Duane Douglas, MCAD | http://www.ssisbi.com | Please mark the post(s) that answered your question.
    Monday, January 26, 2009 7:10 AM
    Moderator
  • Duane Douglas said:

    Jeff Wharton said:

    So please explain to me how updating a job (which updates a table under the covers) or dynamically creating and starting a job every time you wish to change values (which requires updating multiple tables under the covers and is a lot of effort) is more efficient than updating a single configuration table and running sp_start_job!

    Cheers
    Jeff


    SQL Server MVP / WARDY IT Solutions, Solutions Architect

    which tables are updated by sql server agent when a job is created or updated?


    Duane Douglas, MCAD | http://www.ssisbi.com | Please mark the post(s) that answered your question.

    nevermind.  i found them.


    Duane Douglas, MCAD | http://www.ssisbi.com | Please mark the post(s) that answered your question.
    Monday, January 26, 2009 7:13 AM
    Moderator

  • which tables are updated by sql server agent when a job is created or updated?


    Duane Douglas, MCAD | http://www.ssisbi.com | Please mark the post(s) that answered your question.



    The following tables in the MSDB database:

    dbo.sysjobs
    dbo.sysjobschedules
    dbo.sysjobsteps
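
    If you want to see exactly what a job update touches, a quick query against two of these (the job name is hypothetical) shows the step command that sp_update_jobstep would be rewriting:

    SELECT j.name, j.date_modified, s.step_id, s.subsystem, s.command
    FROM   msdb.dbo.sysjobs     AS j
    JOIN   msdb.dbo.sysjobsteps AS s ON s.job_id = j.job_id
    WHERE  j.name = N'Run ImportCustomer';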


    SQL Server MVP / WARDY IT Solutions, Solutions Architect
    Monday, January 26, 2009 7:14 AM
  • Jeff Wharton said:

    which tables are updated by sql server agent when a job is created or updated?


    Duane Douglas, MCAD | http://www.ssisbi.com | Please mark the post(s) that answered your question.



    The following tables in the MSDB database:

    dbo.sysjobs
    dbo.sysjobschedules
    dbo.sysjobsteps


    SQL Server MVP / WARDY IT Solutions, Solutions Architect

    if i'm not mistaken, a proxy account would solve the security issues.

     

     

    also, i'm not following why sp_update_jobstep would need to update dbo.sysjobs and dbo.sysjobschedules.


    Duane Douglas, MCAD | http://www.ssisbi.com | Please mark the post(s) that answered your question.
    Monday, January 26, 2009 7:36 AM
    Moderator
  • Hi,

    A proxy account is used to give a job access to resources outside of SQL Server, not to provide a user with permissions to update jobs via sp_update_job or sp_update_jobstep.  Besides, having to set up special accounts to enable job modification is an additional overhead.

    You asked what tables would be affected by dropping and creating a job or updating a job, so I listed the tables.  For clarity, all three tables would be updated when you dropped and created a job.  dbo.sysjobsteps would be updated when you modified a job step.

    Given that there would be at least one table updated in your solutions, they're no more efficient than mine (actually, dropping and creating jobs is less efficient!).  Also, your solutions require additional permissions to enable the dropping, creating or updating of a job.

    Cheers
    Jeff


    SQL Server MVP / WARDY IT Solutions, Solutions Architect
    Monday, January 26, 2009 7:47 AM
  • Jeff Wharton said:

    Hi,

    A proxy account is used to give a job access to resources outside of SQL Server, not to provide a user with permissions to update jobs via sp_update_job or sp_update_jobstep.

    You asked what tables would be affected by dropping and creating a job or updating a job, so I listed the tables.  For clarity, all three tables would be updated when you dropped and created a job.  dbo.sysjobsteps would be updated when you modified a job step.

    Given that there would be at least one table updated in your solutions, they're no more efficient than mine (actually, dropping and creating jobs is less efficient!).  Also, your solution requires additional permissions to enable the dropping, creating or updating of a job.

    Cheers
    Jeff


    SQL Server MVP / WARDY IT Solutions, Solutions Architect

    ok.  i'll concede that your solution is more efficient than dynamically creating and dropping a job.  but i'm not sure if it's more efficient than updating a job step.  if a package is stored in msdb, then there is no need for it to access any other database to enable package execution...which is something that your solution would require.


    Duane Douglas, MCAD | http://www.ssisbi.com | Please mark the post(s) that answered your question.
    Monday, January 26, 2009 7:57 AM
    Moderator
  • Hi,

    Both solutions require:

    1. Accessing MSDB to execute the job
    2. Accessing another table (whether it be in MSDB or a client database) to update parameters.  Both of which would perform the same.

    My solution does not require any special permissions to change parameter values as is the case for updating a job step.

    The whole purpose of package configurations is to provide the means of updating values at runtime.  Why not use them as intended!

    Cheers
    Jeff


    SQL Server MVP / WARDY IT Solutions, Solutions Architect
    Monday, January 26, 2009 8:11 AM
  • Jeff Wharton said:

    Hi,

    Both solutions require:

    1. Accessing MSDB to execute the job
    2. Accessing another table (whether it be in MSDB or a client database) to update parameters.  Both of which would perform the same.

    My solution does not require any special permissions to change parameter values as is the case for updating a job step.

    The whole purpose of package configurations is to provide the means of updating values at runtime.  Why not use them as intended!

    Cheers
    Jeff


    SQL Server MVP / WARDY IT Solutions, Solutions Architect

    i'm not sure whether changing databases to update a table would result in a performance penalty, but i imagine that it could.  also, creating an ssis proxy account is a trivial matter.  in fact, this is the recommended approach when executing packages via sql server agent jobs.

     

    while it is true that the main purpose of package configurations is to dynamically pass values to a package at run-time, package configurations themselves weren't intended to be modified at run-time.  quite frankly, i think that your solution (although workable) adds an unnecessary layer of complexity when executing packages with dynamic values.



    Duane Douglas, MCAD | http://www.ssisbi.com | Please mark the post(s) that answered your question.
    Monday, January 26, 2009 8:21 AM
    Moderator
  • Creating a Proxy when executing packages via SQL Server Agent jobs is not the recommended approach.  Proxies are required if a job accesses resources external from SQL Server. 

    Quote from BOL: Transact-SQL job steps do not use SQL Server Agent proxies. Instead, the job step runs as the owner of the job step (this is specified on the Job tab), or as the SQL Server Agent service account if the owner of the job step is a member of the sysadmin fixed server role. Members of the sysadmin fixed server role can also specify that Transact-SQL job steps run under the context of another user by using the database_user_name parameter of the sp_add_jobstep stored procedure. For more information, see sp_add_jobstep (Transact-SQL).

    Creating a proxy account is trivial, however it's another requirement in your overall solution that provides no benefit as it won't be used.  Besides, a proxy will not get around the need for special permissions required to modify jobs.  You still need to assign these permissions to the account making the changes, not the account executing the job.

    When you use package configurations that are stored in SQL Server, they are no different from those stored in XML.  You are updating the parameter values stored within.  You're not modifying the package configuration itself, and I never suggested that at all.

    So, your whole solution is based around using a proxy account, which won't work, and the developer now needs to have higher permissions and understand how to modify jobs. 

    Updating a SQL table and executing a job is far from complex, and it amazes me that you believe that creating proxy accounts (which also requires creating a SQL Server credential mapped to an NT account!), assigning permissions and updating a job is not adding an unnecessary layer of complexity!


    SQL Server MVP / WARDY IT Solutions, Solutions Architect
    Monday, January 26, 2009 8:45 AM
  • Jeff Wharton said:

    Creating a Proxy when executing packages via SQL Server Agent jobs is not the recommended approach.  Proxies are required if a job accesses resources external from SQL Server. 

    Quote from BOL: Transact-SQL job steps do not use SQL Server Agent proxies. Instead, the job step runs as the owner of the job step (this is specified on the Job tab), or as the SQL Server Agent service account if the owner of the job step is a member of the sysadmin fixed server role. Members of the sysadmin fixed server role can also specify that Transact-SQL job steps run under the context of another user by using the database_user_name parameter of the sp_add_jobstep stored procedure. For more information, see sp_add_jobstep (Transact-SQL).

    Creating a proxy account is trivial, however it's another requirement in your overall solution that provides no benefit as it won't be used.  Besides, a proxy will not get around the need for special permissions required to modify jobs.  You still need to assign these permissions to the proxy account.

    When you use package configurations that are stored in SQL Server, they are no different from those stored in XML.  You are updating the parameter values stored within.  You're not modifying the package configuration itself, and I never suggested that at all.

    So, your whole solution is based around using a proxy account which won't work and the developer now needs to have higher permissions and understand how to modify jobs. 

    Updating a SQL table and executing a job is far from complex, and it amazes me that you believe that creating proxy accounts (which also requires creating a SQL Server credential mapped to an NT account!), assigning permissions and updating a job is not adding an unnecessary layer of complexity!


    SQL Server MVP / WARDY IT Solutions, Solutions Architect

     

    How to run a SQL Server 2005 Integration Services package as a SQL Server Agent job step

     

    Method 2

    1. Create a Microsoft Windows user account that meets the security context requirement for the package.
    2. Create a credential in an instance of SQL Server 2005.

      Note A credential is a record that contains the authentication information. The authentication information is needed to connect to a resource that is outside SQL Server.
    3. Create a SQL Server Agent proxy in the instance of SQL Server 2005. When you create the SQL Server Agent proxy, perform the following operations for the SQL Server Agent proxy in the New Proxy Account dialog box:
      • In the Credential name box, click the credential that you created in step 2.
      • Under Active to the following subsystems, click to select the SQL Server Integration Services Package check box.
      Note SQL Server Agent proxies define the security context for a job step. SQL Server Agent proxies use credentials to store information about Windows user accounts. The user who is specified in the credential must have the "Log on as a batch job" permission on the computer that is running SQL Server 2005. You can grant the "Log on as a batch job" permission to a user in the Local Group Policy Editor.
    4. Create a SQL Server Agent job step that runs the package. When you create the job step, click the SQL Server Agent proxy that you created in step 3 for the job step in the Run as list in the New Job Step dialog box.

     

    In this case, the job step subsystem can be cmdexec, t-sql or ssis.  no permissions other than the sql server agent permissions are required to update a job step.
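
    for what it's worth, here's a rough t-sql sketch of steps 2 and 3 above (the windows account, password, and object names are placeholders, not tested):

    -- step 2: create a credential mapped to the Windows account.
    USE master;
    CREATE CREDENTIAL SSISRunCredential
        WITH IDENTITY = 'MYDOMAIN\SSISRunAccount',
             SECRET   = 'StrongPasswordHere';

    -- step 3: create a SQL Server Agent proxy on that credential and grant
    -- it access to the SSIS subsystem.
    USE msdb;
    EXEC dbo.sp_add_proxy
         @proxy_name      = N'SSISRunProxy',
         @credential_name = N'SSISRunCredential',
         @enabled         = 1;

    EXEC dbo.sp_grant_proxy_to_subsystem
         @proxy_name     = N'SSISRunProxy',
         @subsystem_name = N'SSIS';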

     


    in your solution, the configuration would require updating because the values would be constantly changing.

     

    again, creating a proxy account is a best practice for executing a package via a sql server agent job, not an unnecessary layer of complexity.  furthermore, updating the sysjobsteps table is less error prone, less prone to performance hits, easier to troubleshoot, and easier to maintain than updating a table in another database.



     



    Duane Douglas, MCAD | http://www.ssisbi.com | Please mark the post(s) that answered your question.
    Monday, January 26, 2009 9:22 AM
    Moderator
  • Nice of you to only post half of this article.  There are actually two best practices in this article, however you only found it appropriate to include the one that fits your solution.  Method 1 is what you'd use for my solution.  You could also use method 2 if you so wished :-)

     

    Method 1 discusses how to implement an SSIS job without using a proxy account and only includes two simple steps. 

     

    1. Create a SQL Server Agent job step that runs the package. When you create the job step, click SQL Agent Service Account in the Run as list in the New Job Step dialog box.

      Note You must be a member of the sysadmin fixed server role to specify SQL Agent Service Account for the job step.
    2. Grant the appropriate permissions for the SQL Server Agent service account so that the SQL Server Agent service account meets the security context requirement for the package.

     

      Note These permissions are required for the package to access the necessary resources.

     

    In future, if you intend to post details of an article, you should include ALL options within an article, not just those that appear to support your conclusion!

     

    What you STILL fail to understand is that modifying a job via sp_update_jobstep has NOTHING to do with using a proxy account.  Proxy accounts are used during the EXECUTION of a step, not the MODIFICATION of a step.  You cannot update a job step from within the job while the job is executing.  The job needs to be updated PRIOR to the job executing.  Once you understand these concepts, you'll understand that you WILL need additional permissions to modify the job, as this can't be done under the context of the proxy. 

     

    You can also see from your partial article that to use a proxy account you need to create a SQL Server credential and make changes to the Local Group Policy as well.  That's two more steps required to implement your solution. 

     

    Therefore, to implement your solution would require:

     

    1. Creating a SQL Server Credential mapped to a NT account that has permissions to access SQL Server
    2. Creating a proxy account mapped to the SQL Server Credential and then assigning this Proxy to the job
    3. Modifying the Local Group Policy of the server to give your proxy account the "Log on as a batch job" permission
    4. Updating the SQL Server Job using sp_update_jobstep with an account that is a member of the sysadmin fixed server role or that’s a member of one of the following SQL Server Agent fixed roles: SQLAgentUserRole, SQLAgentReaderRole, SQLAgentOperatorRole (this can't be the proxy as a proxy is only used for execution of a job).

    I'm starting to see how this is far simpler and more efficient than:

    1. Updating a few values in a table
    2. Executing sp_start_job

    The configuration in my solution would not require modification.  For example:

     

    You configure the package to accept 3 parameters A, B, C.  You can then modify these parameters at will.  You do not need to add D, E or F or remove B and C.  You are simply modifying parameter values.

     

    Updating the sysjobsteps table is far from less error prone, as there are a number of parameters that can be passed into the sp_update_jobstep stored proc, any of which could corrupt the step if the correct one is not used.  You could even use the wrong job_id, job_name or step_id and update the wrong step in the wrong job! 

     

    How is updating the sysjobsteps table less prone to performance hits?  This table contains all the job step information for all jobs implemented in that SQL Server instance, so there could potentially be hundreds of steps stored and read in this table.  My table would include only a couple of records relating to my package.

     

    How is the sysjobsteps table easier to troubleshoot and easier to maintain than updating a table in another database?  A table that only includes your records and not records relating to other jobs being executed by SQL Server Agent is far easier to troubleshoot and easier to maintain.

     

    Monday, January 26, 2009 11:01 AM
  • With regards to your solution, could you please provide an example of how you would achieve the following:

    1. Updating the job step with the new parameter details
    2. Using an account with the appropriate role membership to update the job

     


    SQL Server MVP / WARDY IT Solutions, Solutions Architect
    Monday, January 26, 2009 11:15 AM
  • Jeff Wharton said:

    With regards to your solution, could you please provide an example of how you would achieve the following:

    1. Updating the job step with the new parameter details
    2. Using an account with the appropriate role membership to update the job

     


    SQL Server MVP / WARDY IT Solutions, Solutions Architect

    first of all, in regards to my post about a proxy being a best practice, i didn't find it necessary to post the entire article.  i only wanted to prove that using a proxy is indeed a best practice.  the fact that there are other appropriate methods is irrelevant to the point i was making.

     

    secondly, in regards to providing a detailed example of my proposed solution, this is something i'd rather not do.  i'm a volunteer here -- i don't get paid by microsoft or anyone else to assist the people who post here.  considering that you're an mvp, i think you would agree that it is obvious that my proposed solution would work if it is designed correctly.  i don't at all doubt that you are more than capable of figuring out the implementation details for yourself.



    Duane Douglas, MCAD | http://www.ssisbi.com | Please mark the post(s) that answered your question.
    Tuesday, January 27, 2009 8:06 AM
    Moderator
  • Hi,

    I agree that your solution would work if designed correctly. 

    This whole discussion started due to you stating that my solution was excessive and inefficient.  I then asked you to explain how your solution was more efficient and to this point you haven't done so.  In the end, it's up to the OP to decide based on explanation and examples which solution they wish to implement.

    I too don't get paid to post on this forum and I also don't get paid as an MVP.  BTW, I didn't get my MVP for working on forums, I got it from running a SQL Server User Group, Presenting at Tech-Ed in Australia, US and EMEA, presenting at other User Groups, devoting weekends assisting in running IT camps and doing other SQL related things.  I looked at this forum last week to try and find an answer to a problem I had and noticed a lot of unanswered questions so decided to help out.  Now that I've started I actually enjoy it and have seen a lot of good stuff on here.

    Given that you are a moderator on this forum, I'm surprised at your lack of willingness to assist by providing an example.  I'm sorry, but I'd expect more from a forum moderator, especially one who disagrees with a provided solution and was prepared to argue the case over 20-odd posts.

    Cheers
    Jeff


    SQL Server MVP / WARDY IT Solutions, Solutions Architect
    Tuesday, January 27, 2009 8:25 AM
  • Jeff Wharton said:

    Hi,

    I agree that your solution would work if designed correctly. 

    This whole discussion started due to you stating that my solution was excessive and inefficient.  I then asked you to explain how your solution was more efficient and to this point you haven't done so.  In the end, it's up to the OP to decide based on explanation and examples which solution they wish to implement.

    I too don't get paid to post on this forum and I also don't get paid as an MVP.  BTW, I didn't get my MVP for working on forums, I got it from running a SQL Server User Group, Presenting at Tech-Ed in Australia, US and EMEA, presenting at other User Groups, devoting weekends assisting in running IT camps and doing other SQL related things.  I looked at this forum last week to try and find an answer to a problem I had and noticed a lot of unanswered questions so decided to help out.  Now that I've started I actually enjoy it and have seen a lot of good stuff on here.

    Given that you are a moderator on this forum I'm surprised at your lack of willingness to assist with providing an example.  I'm sorry, but I'd expect more from a forum moderator; especially if they disagree with a provided solution and were prepared to argue the case over 20 odd posts.

    Cheers
    Jeff


    SQL Server MVP / WARDY IT Solutions, Solutions Architect

    i guess that's why i'm not an mvp.

    ok.  since you're an mvp and willing to share your knowledge, why don't you share an entire detailed example of your proposed solution, including code?  that way, all of us can discuss the pros and cons of it.  i'm sure that we can all benefit from such an endeavor.


    Duane Douglas, MCAD | http://www.ssisbi.com | Please mark the post(s) that answered your question.
    Tuesday, January 27, 2009 12:25 PM
    Moderator
  • Already working on it and will have it posted tomorrow night (AEDST) :-)
    SQL Server MVP / WARDY IT Solutions, Solutions Architect
    Tuesday, January 27, 2009 12:30 PM
  • Jeff Wharton said:

    Already working on it and will have it posted tomorrow night (AEDST) :-)


    SQL Server MVP / WARDY IT Solutions, Solutions Architect

    if you don't mind, please put it in a separate thread so that it gets more visibility.

    thanks.



    Duane Douglas, MCAD | http://www.ssisbi.com | Please mark the post(s) that answered your question.
    Tuesday, January 27, 2009 12:34 PM
    Moderator
  • Sure.
    SQL Server MVP / WARDY IT Solutions, Solutions Architect
    Tuesday, January 27, 2009 12:35 PM
  • Can you please provide the link to access Jeff Wharton's solution?

    Thursday, September 24, 2009 6:20 PM
  • Can you please provide the link to access Jeff Wharton's solution?

    if i'm not mistaken, he hasn't provided it.

    Duane Douglas, MCAD, MCITP | http://www.ssisbi.com | Please mark the post(s) that answered your question.

    SSIS Business Intelligence

    Friday, September 25, 2009 6:11 AM
    Moderator
  • Duane, would you be able to help by providing some examples of how to implement your solution?  Basically, what you asked Jeff to do.

    Thanks,

    Friday, September 25, 2009 9:17 PM