Should I test Web Services using Visual Studio or SoapUI?

  • Question

  • Hi,

    I have a dilemma: should I test web services at the system level using Visual Studio or the dedicated SoapUI tool? We use TFS and Visual Studio 2010, and we have automated builds, automated unit tests, and integration tests.

    I was about to choose SoapUI because it looked nice, but then I realized that it would also require some coding, just as tests created in VS do. It has some .NET plugin, but I don't know how it works.

    Thanks, I would really appreciate your input on these two options. 

    Wednesday, June 30, 2010 1:22 PM

All replies

  • Visual Studio is the best option considering productivity. It's much easier than SoapUI.

     

    >Bineesh

    Wednesday, June 30, 2010 6:55 PM
  • I would recommend using Visual Studio over any other tool.

    You can easily create automated test cases in a Visual Studio Test project.
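
    A minimal sketch of what such a test can look like, assuming a generated service proxy named OrderServiceClient with a GetOrderStatus operation (both names are made up for illustration):

        using Microsoft.VisualStudio.TestTools.UnitTesting;

        [TestClass]
        public class OrderServiceTests
        {
            [TestMethod]
            public void GetOrderStatus_KnownOrder_ReturnsExpectedStatus()
            {
                // OrderServiceClient is the proxy Visual Studio generates when
                // you add a service reference -- the name is hypothetical.
                var client = new OrderServiceClient();
                string status = client.GetOrderStatus(42);
                Assert.AreEqual("Shipped", status);
            }
        }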

    Let me know if you need more help on this. I have always used Visual Studio for any such task.

    Hope this helps.


    Vidya Vrat Agarwal. http://dotnetpassion.blogspot.com
    Wednesday, June 30, 2010 6:56 PM
  • Thank you both for your input. I guess you are right; it would be easiest to do this in VS, since all our other tests are already automated in our builds. We do our system testing in MS Test Manager, so it felt somehow compelling to use something other than a coding tool for system-level WS testing. But like I said, if we want to "do it right" in SoapUI, we need to do some scripting anyway.

    Do you have any "best practices" for doing the system-level WS tests? I guess they will resemble unit tests a lot, with added validation for business rules. I am especially wondering about:

    - should I use some external data source, or is it most convenient to "set up" some test data prior to the test run?

    - what would be the most convenient way to test validation and business rules for each test script run? I.e., one test script could test a set of validation rules for a specific use case. I am thinking of the Command pattern with Strategy, but how should I parse the XML: can XPath be used? (See the rough sketch after this list.)

    - can these test cases be designated as "system-level" tests and integrated into a build for automatic execution post-build?
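
    To make the second question concrete, here is roughly what I picture for the XPath validation; the element names and namespace below are made up:

        [TestMethod]
        public void RejectedOrder_StatusIsRejected()
        {
            // In the real test this XML would come from the service response.
            string responseXml =
                "<Order xmlns=\"http://example.com/orders\"><Status>Rejected</Status></Order>";

            var doc = new System.Xml.XmlDocument();
            doc.LoadXml(responseXml);

            // XPath does work for this, given a namespace manager for the
            // (assumed) response namespace.
            var ns = new System.Xml.XmlNamespaceManager(doc.NameTable);
            ns.AddNamespace("o", "http://example.com/orders");

            var node = doc.SelectSingleNode("/o:Order/o:Status", ns);
            Assert.AreEqual("Rejected", node.InnerText);
        }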

    Wednesday, June 30, 2010 7:39 PM
  • Note that web services can be a bit of a problem with automated testing - depending on how quickly they run.

    Web services may take time to crank up.  I've worked with some that took a minute to reply.  Developers expect unit tests to run in a couple of seconds.  Especially if you're doing TDD, you run tests very frequently, so something that sits there for minutes is simply unacceptable.

    You can often get around part of this by mocking the database calls.  If pretty much all your service does is database interfacing, then that can mean all the test does is check that field lists match.  Of course, you then need to bear in mind that if you mocked those field lists, you're not checking the database fields - you're checking your mocked list of fields.

    So it depends on what you're doing, but I would suggest keeping your service tests in a separate project so they can be run just occasionally.  You may want two test projects for service tests - one that uses mocking and one that uses a real database, the latter to be run occasionally and probably as part of system testing.
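
    Something along these lines is what I mean by mocking the database calls; every name here is invented for illustration:

        // The service logic depends on an interface rather than the database
        // directly, so a hand-rolled fake can stand in during fast test runs.
        public interface IOrderRepository
        {
            string GetStatus(int orderId);
        }

        public class FakeOrderRepository : IOrderRepository
        {
            // Canned data instead of a real database round trip.
            public string GetStatus(int orderId) { return "Shipped"; }
        }

        public class OrderService
        {
            private readonly IOrderRepository _repository;
            public OrderService(IOrderRepository repository) { _repository = repository; }

            // The piece of service logic under test.
            public bool IsShipped(int orderId)
            {
                return _repository.GetStatus(orderId) == "Shipped";
            }
        }

        [TestMethod]
        public void OrderService_WithFakeRepository_ReportsShipped()
        {
            var service = new OrderService(new FakeOrderRepository());
            Assert.IsTrue(service.IsShipped(42));
        }

    The real-database twin of this test project would wire OrderService to the actual repository instead.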

    Thursday, July 1, 2010 7:52 AM
  • Just to stress the database aspect.

    If you're going to run tests which involve a database, then remember they need to be repeatable.  You need some way to easily restore (or otherwise reset) that database prior to each run.  If setting that database up will take some time, then you need to work out some mechanism allowing for that, and you need to remember that such a test run is a long process.
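
    As a sketch, the restore can live in a ClassInitialize method so it runs once before the test class; the server name, database name, and backup path are all assumptions:

        [ClassInitialize]
        public static void RestoreTestDatabase(TestContext context)
        {
            // Connect to master so the database being restored is not in use.
            using (var conn = new System.Data.SqlClient.SqlConnection(
                @"Server=.\SQLEXPRESS;Database=master;Integrated Security=true"))
            {
                conn.Open();
                var cmd = new System.Data.SqlClient.SqlCommand(
                    @"ALTER DATABASE TestDb SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
                      RESTORE DATABASE TestDb FROM DISK = 'C:\Backups\TestDb.bak' WITH REPLACE;
                      ALTER DATABASE TestDb SET MULTI_USER;", conn);
                cmd.CommandTimeout = 600;  // restores can take a while, as noted above
                cmd.ExecuteNonQuery();
            }
        }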

    Thursday, July 1, 2010 8:05 AM
  • We have unit tests that usually use mocked databases. For our web services we are actually lacking unit/integration tests, which is of course rather unacceptable, but that is business as usual :) The type of testing I am referring to in my messages is system-level testing, so I will be testing against a WS deployed in the system environment, and therefore close to the real thing, using a real database. You had a good point about a call taking a while. What our WS does is run some validations against the data in the request message and the data in the DB, so there shouldn't be any long processing times such as in SOA testing cases.

    Regarding test data, I still don't know whether I should insert data in the test script, or if I should create SQL insert scripts that are run automatically as the DB is deployed. We have done the latter in our manual test cases.

    Thursday, July 1, 2010 2:18 PM
  • I've often seen data inserted and then purged as part of the setup/teardown procedures of the unit tests.  It's a pain either way, having to do all of that.  Sometimes I think it would be best to just create a copy of the database for unit testing that gets restored from time to time, to avoid having to write all these inserts and deletes in every test.
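
    A bare-bones version of that insert-then-purge pattern, with the table, values, and connection string invented for illustration:

        [TestClass]
        public class OrderDataTests
        {
            // Assumed connection string for the test database.
            private const string TestConnectionString =
                @"Server=.\SQLEXPRESS;Database=TestDb;Integrated Security=true";

            [TestInitialize]
            public void InsertTestData()
            {
                Execute("INSERT INTO Orders (Id, Status) VALUES (42, 'Shipped')");
            }

            [TestCleanup]
            public void PurgeTestData()
            {
                Execute("DELETE FROM Orders WHERE Id = 42");
            }

            private static void Execute(string sql)
            {
                using (var conn = new System.Data.SqlClient.SqlConnection(TestConnectionString))
                {
                    conn.Open();
                    new System.Data.SqlClient.SqlCommand(sql, conn).ExecuteNonQuery();
                }
            }
        }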
    Thursday, July 1, 2010 2:25 PM
  • I have worked on the same type of project with the same requirements.

    Test data needs to be a separate set of records upon which you run your unit/end-to-end test cases.

    I would recommend having a "SlimDB" which mimics the full production DB.

    And you can restore it again in the TestCleanUp section before you exit.

     


    Vidya Vrat Agarwal. http://dotnetpassion.blogspot.com
    Thursday, July 1, 2010 8:14 PM
  • If your database-dependent tests are separate, then you have that flexibility.  Long run times only matter if the developer has to run them a lot.

    Personally, I would recommend building a set of test data and restoring it.  This way you can easily add extra test data, by using the application or whatever, and create a new backup.

    Part of my reasoning here is that some people just don't get the scripting thing, but they can still manage to do backups and restores through SQL Server Management Studio.  BAs tend to be in this group, and they are often the best people to craft nice test data.

    Friday, July 2, 2010 8:58 AM
  • Hi Vidya,

    Can you provide an example or tutorial on how to use Visual Studio to automate web service tests?

     

    Thanks!

    Wednesday, June 1, 2011 9:26 PM
  • Did you get any help on automating web service tests? If yes, could you please share an example?
    Tuesday, October 4, 2011 5:14 PM