Automate the upgrading of MS Access Databases from prior versions?

  • Question

  • Hello All:

    Here is my problem. I work for a very large corporation. I need to upgrade an estimated 1,000 Microsoft Access MDBs across many departments and many offices from prior versions of Access to MS Access 2016. What tools can you all recommend that will make my life easier? Also, is it possible to automate this process in any way?

    Tuesday, July 28, 2020 5:59 PM

All replies

  • You can automate upgrading the file format from mdb to accdb, but beyond that, it requires manual interaction to validate ActiveX controls, compile the code, etc., and of course, test.

    Perhaps the following may help

    https://www.devhut.net/2016/09/18/convert-mdbs-into-accdbs/

    If the databases are Access 2007+ (accdb extensions) then there is nothing for you to do.
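    To illustrate the automation point, here is a minimal VBA sketch of a batch conversion, run from inside Access itself (from a scratch database, not one of the files being converted). The folder path and procedure name are placeholders, and there is no error handling for secured, replicated, or already-open files - treat it as a starting point, not a finished tool:

        ' Batch-convert every .mdb in a folder to the .accdb format.
        ' ConvertAccessProject writes a new file; the original .mdb is left untouched.
        Sub ConvertFolderOfMdbs()
            Dim strFolder As String
            Dim strFile As String

            strFolder = "C:\Databases\"    ' placeholder path - adjust to suit
            strFile = Dir(strFolder & "*.mdb")

            Do While Len(strFile) > 0
                Application.ConvertAccessProject _
                    SourceFilename:=strFolder & strFile, _
                    DestinationFilename:=strFolder & Left(strFile, Len(strFile) - 4) & ".accdb", _
                    DestinationFileFormat:=acFileFormatAccess2007
                strFile = Dir()
            Loop
        End Sub

    Even with something like this, each converted file still needs to be opened, compiled, and tested by hand, as noted above.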


    Daniel Pineault, 2010-2019 Microsoft MVP
    Professional Support: http://www.cardaconsultants.com
    MS Access Tips and Code Samples: http://www.devhut.net


    Tuesday, July 28, 2020 6:10 PM
  • I am not aware of any tool that can do that.

    Custom code could be written. Some of the issues to consider:
    * What is the current version? Some upgrades may need to take two hops.
    * Need to update the References (e.g. a reference to Excel 2003 will have to be updated)
    * If you are going to A2016 64-bit, then code changes will be needed wherever there are Declare statements without PtrSafe (see the sketch below)
    * Need to check if there are any ActiveX controls used that do not have a modern equivalent (especially if 64-bit)

    And I am probably forgetting a few. This is a significant task. You will want some guidance from a seasoned Access developer.
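
    On the PtrSafe point, here is a hedged illustration of the kind of change involved. GetTickCount is just a stock Windows API example - your databases will have their own Declare statements:

        ' Legacy declaration - compiles in 32-bit Office only:
        ' Declare Function GetTickCount Lib "kernel32" () As Long

        ' Updated declaration - compiles in both 32-bit and 64-bit Office:
        #If VBA7 Then
            Declare PtrSafe Function GetTickCount Lib "kernel32" () As Long
        #Else
            Declare Function GetTickCount Lib "kernel32" () As Long
        #End If

    Declarations that pass window handles or pointers also need their Long parameters changed to LongPtr, which is one reason this step cannot be blindly automated.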


    -Tom. Microsoft Access MVP

    Tuesday, July 28, 2020 6:13 PM
  • There are a good number of network crawler tools to take inventory of your existing computers, and even of the files on them.

    In fact, one such scanner tool was combined with Access to create a really great migration tool. This was back in 2003, so that specific Access tool no longer exists. It was REALLY nice, since it would:

    Identify which files had code, and which did not.

    Identify the last time each file was used.

    Identify things like external references, etc.

    So, you might scan and find 10,000 databases on your corporate network. However, you might say: don't bother with files not touched in the last 2 years.

    Now you are down to 2,000 files.

    Next, you might just say: any mdb file without code modules? Don't bother - they will open just fine in the newer version.

    So, now you might be down to say 50 files that you ACTUALLY have to deal with.

    So you want to take a MASH-hospital type of approach here. You thus attack the problem and ONLY deal with the files that make sense to deal with - for the rest, you often don't have to do anything.

    Now, what looks like a huge mountain to climb becomes a workable project, one with information and a task list to work on.
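
    As a rough illustration of that triage idea (this is NOT the old Microsoft tool - just a minimal sketch), a bit of VBA can walk a folder tree, list every Access file, and flag anything not touched in 2 years. It assumes a reference to Microsoft Scripting Runtime and just prints to the Immediate window:

        ' Walk a folder tree and triage the Access files found by last-modified date.
        Sub InventoryAccessFiles(ByVal strFolder As String)
            Dim fso As New Scripting.FileSystemObject
            Dim fil As Scripting.File
            Dim fld As Scripting.Folder
            Dim strExt As String

            For Each fil In fso.GetFolder(strFolder).Files
                strExt = LCase(fso.GetExtensionName(fil.Name))
                If strExt = "mdb" Or strExt = "accdb" Then
                    ' Files untouched for 2+ years become skip/archive candidates
                    Debug.Print fil.Path, fil.DateLastModified, _
                        IIf(fil.DateLastModified < DateAdd("yyyy", -2, Date), _
                            "SKIP/ARCHIVE", "REVIEW")
                End If
            Next fil

            ' Recurse into subfolders
            For Each fld In fso.GetFolder(strFolder).SubFolders
                InventoryAccessFiles fld.Path
            Next fld
        End Sub

    Last-modified dates are only a rough proxy for "last used", so treat the output as a first cut, not gospel.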

    They don't make that Access scanner/upgrade system anymore. But they STILL have and offer the network crawler system. And it supports Access.

    You can find a link here:

    Office Migration Planning Manager overview for Office 2010

    https://docs.microsoft.com/en-us/previous-versions/office/office-2010/cc179179(v=office.14)

    Deploying the OMPM File Scanner

    There are three possible methods for installing and running the OMPM File Scanner in your environment:

    1. Centrally-run scan: Download to an administrative client computer and use that computer to scan specified network locations. This method is the simplest scanning approach and is often used for small-scale, targeted assessments.

    2. Distributed scan from a shared folder: Download to a central file share and use that share to scan all environment client computers. This method is useful for enabling a consistent file scan across a group of client computers.

    3. Decentralized scan: Download to each environment client computer for local scanning. This method delivers the most efficient scan execution results.

    So, that tool creates an XML file. At that point, you pull it into, say, Access - or better yet, SQL Server - and run your reports.
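
    For what it's worth, Access can pull such an XML file in directly from VBA - a minimal sketch, with a placeholder file name (OMPM's actual schema dictates what tables get created):

        ' Import one OMPM scan result file into local Access tables for reporting.
        Sub ImportOmpmScan()
            Application.ImportXML _
                DataSource:="C:\OMPM\Scans\scan0001.xml", _
                ImportOptions:=acStructureAndData
        End Sub

    For very large scans, SQL Server is the better home for the data, as suggested above.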

    And note this:

    If the AccessScan setting is set to 1 in Offscan.ini, OMPM scans Access files and displays results on the Access Compatibility tab of the reporting tool.

    So, it does gather information - including information about Access. I have not used this tool in a VERY long time, but for a large project, this type of network scanning tool - one that tells you the Office version, whether VBA is present, when a file was last used, and more? It is the kind of tool I would certainly adopt.

    While that version is for Office 2010, there does seem to be a newer, updated version here for use with Office 365:

    https://www.microsoft.com/en-us/download/details.aspx?id=11454#:~:text=The%20Office%20Migration%20Planning%20Manager%20%28OMPM%29%202010%20is,a%20Microsoft%20Office%202010%20or%20Office%20365%20deployment.

    So, no question, there are some tools, and when you start having to deal with possibly 1,000's of files over many computers, then some form of automated file gathering, network crawling, and reporting system is a must-have.

    Regards,

    Albert D. Kallal (Access MVP 2003-2017)



    Tuesday, July 28, 2020 11:38 PM
  • Maybe it's time to rethink the use of 1,000 databases... even in a very large corporation, I am pretty sure that each and every one serves just a tiny fraction of the overall department's jobs.

    So check, for each department, which databases are used... consolidate them, and put permissions in place to keep everyone in their place...

    In the end, the final number should be below 100.

    Thursday, July 30, 2020 7:16 AM
  • This is often like the proliferation of Excel or other types of files.

    In many ways, the issue is often that you have folders full of Excel sheets or Access data files.

    And while those 1,000's of Excel or Access files might be of little use?

    One thus has to build a plan of attack. So, some tools that give you an inventory, and information about when each file was last opened, etc., can be rather valuable here.

    And I have seen this issue even with PDFs in some folders. The same goes for Word files.

    So, Office files and documents often pile up over time. The management challenge here is simply what approach one is going to adopt for these corporate file stashes.

    I don't think this issue is a whole lot different for Excel or Access. The only difference is that Access has a lot of extra features and abilities. So, is this just some data file (much like an Excel sheet), or is it some reporting tool that a department cobbled together over the years?

    And I have seen the same for Excel. Often there is stunning work and code in an Excel sheet. And figuring out who made the sheet, and how it works? Often it takes just as long to figure out how the sheet works as it would to re-write it!

    So, I 100% agree that this sounds like a lot of documents. I recall stories of an inventory having been taken of files, and they found in excess of 10,000 Access files.

    They had no idea!

    Ultimately, some kind of inventory approach - and, say, archiving older files that have not been used in 2 years - is a good starting point.

    For databases that don't have code, in most cases little or next to nothing needs to be done.

    It is when you find a database with application parts - VBA code, forms, reports, etc. - that the challenges arise. So one needs to zero in on those files; the others can pretty much be left as-is, or archived.
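
    As a hedged sketch of that "zero in" step, a function like the following opens a database read-only via DAO and reports whether it has any of those application parts. The function name is mine, and error handling for secured or password-protected files is omitted:

        ' Return True if the database contains modules, forms, or reports -
        ' i.e. it is an application, not just a data file.
        Function HasApplicationParts(ByVal strPath As String) As Boolean
            Dim db As DAO.Database
            ' Open shared (not exclusive) and read-only
            Set db = DBEngine.OpenDatabase(strPath, False, True)
            HasApplicationParts = _
                db.Containers("Modules").Documents.Count > 0 _
                Or db.Containers("Forms").Documents.Count > 0 _
                Or db.Containers("Reports").Documents.Count > 0
            db.Close
        End Function

    Combined with the last-used date, that one flag sorts the pile into "leave as-is or archive" and "needs a developer to look at it".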

    Regards,

    Albert D. Kallal (Access MVP 2003-2017)

    Edmonton, Alberta Canada

    Thursday, July 30, 2020 6:01 PM
  • I do not see the point of such a general upgrade project. If we were discussing the users' desktop versioning - yes, definitely, one wants to keep the organization reasonably current. But for individual apps, there is no compelling reason to convert them all as a bulk project. If they work, they work. The owner of each app should manage its update when/if there is a need.
    Thursday, August 6, 2020 12:28 PM
  • One of the solutions is to:

    1. Restore the SQL Server 2000 backup on SQL Server 2008
    2. Set the compatibility level to 100
    3. Create a database backup on SQL Server 2008
    4. Restore the SQL Server 2008 backup on SQL Server 2012
    5. Set the compatibility level to 110

    The downside of this solution is that three versions of SQL Server have to be installed, and the discontinued or deprecated T-SQL features have to be removed manually.

    If there is no access to the live database from which the backup was taken:

    1. Start SQL Server Management Studio and connect to the instance that contains the database
    2. In the database context menu, right-click the desired database and navigate to Tasks => Generate Scripts; this invokes the Generate Scripts wizard
    3. In the Introduction step of the wizard, click the Next button
    4. In the Choose Objects step, make sure that the Script entire database and all database objects option is selected, and click the Next button
    5. In the Set Scripting options step, select the Save scripts to a specific location option, and specify whether database objects and data will be scripted to a single file or to multiple files, as well as the path and encoding of the generated scripts
    6. Click the Advanced button. In the Advanced Scripting Options window that appears: set the Script for Server Version option to SQL Server 2012; set the Types of data to script option to Schema and data; set Script Logins, Script Full-Text Indexes, and Script Triggers to True; then click the OK button
    7. Click the Next button
    8. In the Summary step of the wizard, click the Next button
    9. In the Save Scripts step of the wizard, click the Finish button
    10. Execute the generated SQL script(s) against the newer SQL Server instance

    Although the procedure listed above should work:

    * The discontinued or deprecated T-SQL features would have to be removed manually
    * It might fail if the database contains dependencies that SQL Server fails to recognize (e.g. dependencies on databases located on linked servers)

    If there is no access to the original database for any reason (e.g. migrating a database to an off-site location with no network connectivity), the procedures above are not applicable.
    Thursday, August 6, 2020 2:35 PM
  • The question here is about Access files - not SQL Server.

    So, they have potentially 1,000's of Access files. They need to know whether, after upgrading to a newer version of Office, that large number of Access files can still be used.

    And, the answer depends!

    Are they JUST mdb or accdb files spread all over the place on shared folders, or are the files applications with VBA code, etc.? So, how easy, or how much work, it is to upgrade to the next version of Office will be determined by:

    Cataloging and classifying the large number of Access files.

    Looking at when each file was last used, and also determining whether each Access file has JUST data (quite easy to deal with), or whether the files ALSO have VBA code in them.

    If this were just a few Access files? Not much of a problem. But in some large companies with thousands of desktops? One company scanned its network with a crawler and found 100,000 Access files!!!

    So, this question was not about SQL Server, but about a large company upgrading to the next version of Office - and it was asked in the context of whether, if they upgrade, they can still use and deal with the large number of Access files that have obviously grown like weeds throughout the company's infrastructure.

    Regards,

    Albert D. Kallal (Access MVP 2003-2017)

    Edmonton, Alberta Canada

    Friday, August 7, 2020 4:59 PM
  • It is not that simple.

    They are upgrading to v-next Office.

    It comes down to HOW many files they have, and WHAT kind of files they are.

    The IT department can't just upgrade and then "hope" on a wing and a prayer that everything works.

    Just this week? Some machines at a client's site had become corrupted, or had picked up a virus.

    So, they wiped them, and installed Office 2016 - all nice fresh and new.

    However, that machine was used to build and deploy some Access applications. And with the current covid issue, they have a LOT of remote users (so they are using terminal services). The terminal servers are running the Access 2010 runtime (because it is free - no licensing required). Now, most everyone is using Office 2016, but the terminal servers are not. Suddenly the remote users could not run the Access application, because changes had been made with Access 2016. So, they scrambled for a day, found an Office 2010 disk, and installed Access 2010 on that box.

    So, no, you can't just HOPE things work. In this case, about 50 people were down for the day. The other 50 non-remote users were fine - but the idea that you JUST hope things will work? No, you are going to have a train wreck in many cases if you ignore the issues of upgrading Office.

    Some years ago?

    I was at a presentation about a company upgrading their Office. They did not think Access was used much. They did an automated network crawler scan of all corporate computers. They found 100,000 Access files.

    So, if you have 5 computers? Not an issue. But for a large company with 1,000's of desktops? Then planning and dealing with an Office upgrade can become a REALLY big deal.

    So, yes, they need to inventory those files. If they are JUST data files, then no worry. But what % of them have VBA code? You need to know whether this is a few days of work, or whether the Office rollout will take 2 months of upgrading those files.

    This comes down to the size and scope of the problem at hand. If you have to travel 2 miles? Well, then walking is a possible solution. If you have to travel 2,000 miles? Then walking is not an option.

    The above is why companies like Google or Microsoft ask what seem to be silly interview questions like:

    How would you move Mount Fuji?

    The reason for such questions is that in software and computers? The concept of walking vs. taking a plane - that is, how large the scope of the problem is - will dictate whether some kind of automated approach is warranted.

    They want to know if you are able to approach problem solving in terms of how high, how far, how big.

    The same goes here: some kind of inventory of the Access files, when they were last touched/used, and how many of those files have VBA code in them. All of this information is required to attack this problem.

    As noted, 3 computers and a few files? Sure, just upgrade and let them break.

    But if you have 10,000 - or, as in my example, 100,000 - Access files? That is a different problem in terms of scope, and a different kind of challenge. So, not all scenarios have the luxury of JUST upgrading Office and letting things break.

    You find the problems beforehand, and determine beforehand whether rolling out v-next of Office will impact a few users or a large number of them.

    This all comes down to how many files you are dealing with, and trying to determine how many will cause a problem. And also, per my other example? Well, if you upgrade to v-next Office, then often touching or using those files will lock out, or cause problems for, others who have NOT YET had their machines upgraded. The broken terminal server above is a perfect example.

    So, what you suggest can work in some cases - but for larger companies? Well, "let's see if it breaks" is not a practical solution for rolling out new software.

    As that interview question shows? A good developer will take into account the "scope" of the task. Or in layman's terms?

    How big, how far, how much?

    This kind of contextual thinking is required when dealing with larger software projects - or, in this case, an IT department having to deal with Access files. And that's why you encounter such questions during job interviews: the ability to think in terms of the scope of a problem is a valuable skill.

    Regards,

    Albert D. Kallal (Access MVP 2003-2017)

    Edmonton, Alberta Canada

    Friday, August 7, 2020 5:20 PM
  • I also don't see it as IT's role to do this blindly. This is something that should be done by the developer and properly tested before being implemented.

    I can just imagine an automated upgrade and some critical db not working anymore.

    Normally, IT advises departments/users that the organisation will be upgrading, and provides the means to do upgrades and testing beforehand to evaluate the implications of said upgrade. IT does not just go ahead and alter possibly mission-critical systems without the developer's direct involvement. By simply upgrading things, IT is assuming full responsibility for all the databases, and I've never known an IT department even remotely interested in doing that.

    This is where VMs come in for testing - and where you make each department responsible for its own databases.


    Daniel Pineault, 2010-2019 Microsoft MVP
    Professional Support: http://www.cardaconsultants.com
    MS Access Tips and Code Samples: http://www.devhut.net



    Friday, August 7, 2020 5:21 PM