T4 Template support

    Question

  • Hi,

    It seems that the March 2014 update has some support for T4 templates, but it may be me: it doesn't seem to work?!

    I have created a new project and added a T4 template, which looks nicely defaulted to reading all tables. I build the project, but it fails due to the contents of the .sql file under the .tt file.

    So I delete the contents of the file and rebuild, which then works. Then I use "Transform All T4 Templates" from the Build menu and... nothing happens.

    As I say, probably me. Any chance of a tutorial / blog?!

    Thanks


    Dave Ballantyne ---- http://sqlblogcasts.com/blogs/sqlandthelike/

    Wednesday, March 26, 2014 3:56 PM

Answers

  • Thanks for reporting this Dave!

    Sure enough, a late change we made in the product broke this feature. We moved the DAC installation for VS under the VSInstallDir to provide better isolation for each VS version, and accidentally broke the T4 template directive processor. I've got a fix on my machine and it'll go out with our next update. Unfortunately there is no workaround. T4 templates that don't read the DAC model still work, since they use a much more vanilla directive processor that doesn't use the DacFx binaries.

    • Proposed as answer by CarnegieJ Wednesday, March 26, 2014 5:08 PM
    • Marked as answer by Dave_Ballantyne Friday, March 28, 2014 1:08 PM
    Wednesday, March 26, 2014 5:07 PM
    Owner

All replies

  • Ha, nice to know you are human too :)

    I'm looking forward to the fix; it should help to ease a few clunky dynamic T-SQL routines. The code comments also stated that it reads the last-built dacpac and not the in-memory model, which suggests some processes will not be as smooth as they could be; I guess you would effectively have to 'double build' the dacpac in that case (see the sketch at the end of this reply).

    Also, is it presently compatible with SqlPackage?


    Dave Ballantyne ---- http://sqlblogcasts.com/blogs/sqlandthelike/


    Wednesday, March 26, 2014 5:59 PM
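
    For illustration, here is a minimal sketch (not from the original thread) of the "read the last-built dacpac" approach mentioned above. The path bin\Debug\MyDatabase.dacpac is an assumption made up for this example; the point is simply that the model comes from the compiled output, so the project has to be built before the template sees recent changes.

    using System;
    using Microsoft.SqlServer.Dac;
    using Microsoft.SqlServer.Dac.Model;

    class DacpacTableList
    {
        static void Main()
        {
            // Assumed output path of the database project's last build.
            string dacpacPath = @"bin\Debug\MyDatabase.dacpac";

            // TSqlModel can be loaded straight from a compiled dacpac, which is
            // why changes only show up after a (re)build, i.e. the "double build".
            using (TSqlModel model = new TSqlModel(dacpacPath))
            {
                foreach (TSqlObject table in model.GetObjects(DacQueryScopes.UserDefined, ModelSchema.Table))
                {
                    Console.WriteLine(string.Join(".", table.Name.Parts));
                }
            }
        }
    }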
  • Visual Studio runs custom tools in their own AppDomain and unfortunately our DacFx model is not serializable and therefore cannot cross an AppDomain boundary.  For now, yes, you'll have to double-build sometimes, but our hope is that this will be minimal.

    We'll release a blog when the fix goes out.

    I'm not sure what you mean about compatible with SqlPackage.

    Note that the normal T4 template item template works fine. If you have code generation needs that don't rely on the model, you should be unblocked now (e.g. generating a table from a CSV file; there is a sketch of that at the end of this reply).

    Wednesday, March 26, 2014 10:44 PM
    Owner
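
    As an illustration of the model-free case mentioned above, here is a minimal T4 sketch (not from the original thread) that generates CREATE TABLE scripts from a CSV file. The file name Tables.csv and its "SchemaName,TableName" layout are assumptions made up for this example.

    <#@ template language="C#" hostspecific="true" #>
    <#@ import namespace="System.IO" #>
    <#@ output extension=".sql" #>
    <#
    	// Hypothetical input file: one "SchemaName,TableName" pair per line.
    	string csvPath = this.Host.ResolvePath("Tables.csv");
    	foreach (string line in File.ReadAllLines(csvPath))
    	{
    		if (string.IsNullOrWhiteSpace(line)) continue;
    		string[] parts = line.Split(',');
    #>
    CREATE TABLE [<#= parts[0].Trim() #>].[<#= parts[1].Trim() #>]
    (
    	Id int NOT NULL
    );
    GO
    <#
    	}
    #>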
  • Hi Patrick,

    Yeah, re: SqlPackage, there is no implication here; my bad.

    I see what you mean about the boundaries. I guess that is why the line in the code below, "PrjItem.Object.GetType()", fails with an invalid cast exception?!? I'm no expert here and that's well outside my comfort zone :)

    I've had a play and come up with something equivalent that iterates through the project, reading the files directly and dumping them into a model, just for fun :)

    <#@ template language="C#" debug="true" hostspecific="true" #>
    <#@ assembly name="Microsoft.VisualStudio.Shell.Interop.8.0" #>
    <#@ assembly name="EnvDTE" #>
    <#@ assembly name="EnvDTE80" #>
    <#@ assembly name="VSLangProj" #>
    <#@ assembly name="C:\Program Files (x86)\Microsoft SQL Server\120\SDK\Assemblies\Microsoft.SqlServer.TransactSql.ScriptDom.dll" #>
    <#@ assembly name="C:\Program Files (x86)\Microsoft Visual Studio 11.0\Common7\IDE\Extensions\Microsoft\SQLDB\DAC\120\Microsoft.SqlServer.Dac.dll" #>
    <#@ assembly name="C:\Program Files (x86)\Microsoft Visual Studio 11.0\Common7\IDE\Extensions\Microsoft\SQLDB\DAC\120\Microsoft.SqlServer.Dac.Extensions.dll" #>
    <#@ import namespace="Microsoft.VisualStudio.Shell.Interop" #>
    <#@ import namespace="EnvDTE" #>
    <#@ import namespace="EnvDTE80" #>
    <#@ import namespace="Microsoft.VisualStudio.TextTemplating" #>
    <#@ import namespace="Microsoft.SqlServer.Dac" #>
    <#@ import namespace="Microsoft.SqlServer.Dac.Model" #>
    <#@ import namespace="System.IO" #>
    <#@ import namespace="System.Collections.Generic" #>
    <#@ output extension=".sql" #>
    
    -- Dynamic File generated by db
    <#
    	var hostServiceProvider = (IServiceProvider)this.Host;
    	var dte = (DTE)hostServiceProvider.GetService(typeof(DTE));
    	
    	using (TSqlModel model = new TSqlModel(SqlServerVersion.Sql110, new TSqlModelOptions { }))
    	{
    		foreach(Project project in dte.Solution)
    		{
    			IterateThroughProject(project.ProjectItems,model);
    		}
    
    		List<TSqlObject> allTables = GetAllTables(model);
    		foreach (var table in allTables)
    		{
    #>
    --				Table <#= table.Name.Parts[0] #>.<#= table.Name.Parts[1] #>
    <#
    			 
    		}
    	}
    
    #>
    -- File Done
    <#+
    
    	public List<TSqlObject> GetAllTables(TSqlModel model)
    	{
    	    List<TSqlObject> allTables = new List<TSqlObject>();
    
    	    var tables = model.GetObjects(DacQueryScopes.All, ModelSchema.Table);
    	    if (tables != null)
    	    {
    			allTables.AddRange(tables);
    	    }
    	    return allTables;
    	}
    
    
    	
    	private void IterateThroughProject(ProjectItems PrjItems, TSqlModel model)
    	{
    		foreach(ProjectItem PrjItem in PrjItems)
    		{
    			if(PrjItem.Name.EndsWith(".tt", StringComparison.OrdinalIgnoreCase))
    			{
    				// Don't load the .tt files we are generating from
    				continue;
    			}
    			if(PrjItem.ProjectItems != null)
    			{
    				// Recurse into folders and nested items
    				IterateThroughProject(PrjItem.ProjectItems, model);
    			}
    			// The DatabaseFileNode type check is left commented out: accessing
    			// PrjItem.Object fails across the T4 AppDomain boundary (see the
    			// discussion above), so we filter on the file extension instead.
    			if(//PrjItem.Object.GetType().ToString() == "Microsoft.VisualStudio.Data.Tools.Package.Project.DatabaseFileNode" &&
    				PrjItem.Name.EndsWith(".sql", StringComparison.OrdinalIgnoreCase))
    			{
    #>
    --					This is a sql file and will be processed
    --				<#= PrjItem.FileNames[0] #>
    <#+
    				// Make sure any unsaved editor changes are on disk before reading
    				if (!PrjItem.Saved)
    				{
    					PrjItem.Save();
    				}
    				// Read the script and add its objects to the in-memory model
    				string Script = File.ReadAllText(PrjItem.FileNames[0]);
    				model.AddObjects(Script);
    			}
    		}
    	}
    
    #>


    Dave Ballantyne ---- http://sqlblogcasts.com/blogs/sqlandthelike/


    Friday, March 28, 2014 1:02 PM
  • Yes, that's why it fails: if the object being marshaled isn't serializable, the call will throw (there is a small sketch of this below). BTW, our latest release (April) fixes the T4 template generation out of the box. Your blog post on T4 templates was very cool!
    Friday, April 18, 2014 3:55 PM
    Owner
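
    To illustrate the marshaling point, here is a minimal .NET Framework sketch (not from the original thread); NonSerializableModel is a made-up stand-in for the DacFx model.

    using System;
    using System.Runtime.Serialization;

    // Hypothetical stand-in for the model: neither [Serializable] nor a
    // MarshalByRefObject, so it cannot be passed to another AppDomain.
    class NonSerializableModel
    {
        public string Name = "dbo.MyTable";
    }

    static class AppDomainSketch
    {
        static void Main()
        {
            AppDomain other = AppDomain.CreateDomain("T4Domain");
            try
            {
                // A call on another AppDomain marshals its arguments across the
                // boundary; a non-serializable argument throws at this point.
                other.SetData("model", new NonSerializableModel());
            }
            catch (SerializationException ex)
            {
                Console.WriteLine("Cannot cross the AppDomain boundary: " + ex.Message);
            }
            finally
            {
                AppDomain.Unload(other);
            }
        }
    }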
  • Thanks for reporting this Dave!

    Sure enough, a late change we made in the product broke this feature. We moved the DAC installation for VS under the VSInstallDir to provide better isolation for each VS version, and accidentally broke the T4 template directive processor. I've got a fix on my machine and it'll go out with our next update. Unfortunately there is no workaround. T4 templates that don't read the DAC model still work, since they use a much more vanilla directive processor that doesn't use the DacFx binaries.

    Hi Patrick,

    6 months on, I presume this is fixed in the most current release of SSDT?

    JT

    Thursday, September 25, 2014 11:30 AM