"YOU AND THE ART OF ONLINE DATING" is the only product on the market that will take you step-by-step through the process of online dating, provide you with the resources to help ensure success. Get it now!
If I understand correctly, you are interested in the pros and cons of using XML as a format of data between the database and the application (in this case, a web app).
If I understand correctly, you are interested in the pros and cons of using XML as a data format between the database and the application (in this case, a web app). If you happen to have the entire data to be inserted/updated/deleted as a handy bag of data in your client, then sending it as XML actually makes sense. One simple reason is that this allows a single round-trip to the database server, and reducing round-trips is always a good thing.
But the most important advantage is that you can employ the holy grail of database performance: set-oriented processing. Using the XML methods, especially nodes() and value(), combined with some moderate XPath-fu, you can shred the entire XML parameter received from the application into relational sets, and use set-oriented operations to do the database writes. Take for instance the XML in your post, and let's say it was passed as a parameter named @x of type XML.
You can shred the attributes to be merged into existing elements:

```sql
SELECT x.value(N'@ID', N'int') AS ID,
       x.value(N'.', N'varchar(max)') AS Value
FROM @x.nodes('//Element[not(@Action="delete") and not(@ID=0)]/Attr') t(x);
```

You can shred the attributes that go into new elements:

```sql
SELECT x.value(N'@ID', N'int') AS ID,
       x.value(N'.', N'varchar(max)') AS Value
FROM @x.nodes('//Element[@ID=0]/Attr') t(x);
```

And you can shred the elements to be deleted:

```sql
SELECT x.value(N'@ID', N'int') AS ID
FROM @x.nodes('//Element[@Action="delete"]') t(x);
```

These sets can be manipulated via normal SQL DML: inserted, deleted, updated, or merged into the EAV tables in one single pass. Note that the shredding queries I show here are trivial ones and probably incorrect for your case; they are just to show the way to do it. Now whether this is the best path to take, I don't know.
There are way too many variables and moving pieces, and they lie mostly in your dev team's skill set and existing code base. For sure, XML is a good format for calling into the database to update sets, but XML has its shortcomings too: it is verbose and fat, it is slower to parse than binary formats, and it is actually quite difficult for programmers to fully grok: once you get past the sugar coating, there is a deep (and sometimes messy) layer of XPath, XQuery, namespaces, encodings, CDATA and the rest. I'd say go ahead, prototype, let us know how it goes...
Remus, all the reasons you mention are the exact ones I was thinking about! I do have all the data at once and can stuff it into a property bag, as you say. And exactly so, I can query the XML repeatedly to extract different data sets and then do my joins (row-by-row is truly evil).
In the past I've not been impressed with XML because of how verbose and slow it is. And I also have my reservations about all the extra gunk you mentioned like namespaces, encodings, and CDATA (I want to be able to handle any Unicode character properly). – ErikE Mar 8 '10 at 17:25 So... given that you jumped right in and gave sample queries for me, it seems that you sort of approve of the XML route?
Of course, for now I am stuck with SQL 2000. Do you have any other thoughts on the format of my XML (e.g., is Action="delete" and the mix of element-based and attribute-based data sensible) or on any other good ways to hand in an entire property bag? Last night I did realize that all the things I'm updating share a fairly standard representation, and if I add ElementID1 and ElementID2 to the existing update process, then I could do without XML.
– ErikE Mar 8 '10 at 17:28 Also, I have concerns about creating the XML in the first place. I've found the MSXML libraries to be painfully slow, even just to build XML, and in the past I've done the (gasp! horror!) workaround of concatenating the XML together manually. But that only works when it's kind of simple, and this looks like it won't be so simple. – ErikE Mar 8 '10 at 17:29 I do not disapprove of XML, but I can't recommend it either since, as you are clearly aware, there are just too many moving pieces.
But I will tell you a short story from my professional history. I deal a lot with Service Broker based systems, and in SSB the norm is to get the data as XML: dequeue a message from a queue, then start processing the XML. You can see on my blog at rusanu.com/2006/10/16/writing-service-broker-procedures how various styles of processing the payload impact performance. Set-oriented XML processing is pretty much the fastest way. – Remus Rusanu Mar 8 '10 at 19:07 However, I am finding myself removing set-oriented processing and replacing it with cursor-based, row-by-row processing ('row' extracted from the XML).
The reason for this is complexity. Maintaining the set-oriented procedures is just hard to do; projects take longer than needed and have more bugs, simply because it is hard to wrap one's head around the complex XPath queries involved. And a major problem is error handling: it is hard to detect whether an XML element was simply not processed (i.e. skipped by all the XPath queries). – Remus Rusanu Mar 8 '10 at 19:10
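That silent-skip failure mode can be guarded against by classifying every element exactly once and failing loudly on anything left over. A minimal sketch, in Python with ElementTree as a stand-in for the T-SQL; the predicates and payload shape are assumptions for illustration, not the poster's actual schema:

```python
import xml.etree.ElementTree as ET

def classify(root):
    """Route every Element into exactly one bucket; raise if an element
    matches none of the expected predicates instead of dropping it."""
    buckets = {'delete': [], 'insert': [], 'merge': []}
    for el in root.iter('Element'):
        if el.get('Action') == 'delete':
            buckets['delete'].append(el)
        elif el.get('ID') == '0':
            buckets['insert'].append(el)
        elif el.get('ID') is not None:
            buckets['merge'].append(el)
        else:
            # The element fell through every predicate -- the silent-skip
            # case that is so hard to notice with pure XPath shredding.
            raise ValueError('unhandled element: '
                             + ET.tostring(el, encoding='unicode'))
    return buckets

good = ET.fromstring('<Root><Element ID="1"/><Element Action="delete" ID="2"/></Root>')
print({k: len(v) for k, v in classify(good).items()})  # {'delete': 1, 'insert': 0, 'merge': 1}

bad = ET.fromstring('<Root><Element/></Root>')  # no ID, no Action: skipped by every query
try:
    classify(bad)
except ValueError as e:
    print('caught:', e)
```

In T-SQL the equivalent check would be a final query counting elements not matched by any of the shredding XPath expressions, raised as an error before committing.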
I don't see any reason not to use XML columns in SQL Server 2005, and to do all your work via stored procedures. You probably don't have time to abstract your data access to hide the ugliness of the data model, so why not just access it as-is, using XML? You can use XQuery in SQL Server to do updates, queries, etc. Now that I think of it, you might still put one layer of abstraction between the ASP pages and the database.
That would allow you in the future to use XSLT to transform the structure of your XML into a format that will perform better in the database.
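As one concrete (and hypothetical) example of such a transform: attribute-centric XML generally shreds faster than element-centric XML, since there are fewer nodes to navigate. The kind of reshaping described here with XSLT can be sketched equivalently in Python with ElementTree; the tag names are illustrative, not from the poster's schema:

```python
import xml.etree.ElementTree as ET

def to_attribute_centric(root):
    """Fold simple child elements into attributes on their parent,
    e.g. <Item><Name>a</Name></Item> -> <Item Name="a"/>."""
    out = ET.Element(root.tag, dict(root.attrib))
    for item in root:
        folded = ET.SubElement(out, item.tag, dict(item.attrib))
        for child in item:
            folded.set(child.tag, child.text or '')
    return out

src = ET.fromstring('<Rows><Item><Name>a</Name><Qty>3</Qty></Item></Rows>')
print(ET.tostring(to_attribute_centric(src), encoding='unicode'))
```

An abstraction layer between the ASP pages and the database is the natural place to hang such a rewrite, so the page-facing XML shape can stay stable while the database-facing shape is tuned.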
" Some brief examples of what you're talking about would be good. I don't mean working code, I'm just having trouble connecting what you're saying to the real world. For example, what format would perform better in the database?
Note that the question here isn't about reading the database--that works fine. It's about writing back to the database. Though I suppose if multiple requests are working for reading, then multiple requests for writing could be okay.
It's just not so simple to submit entire recordsets as parameters to SPs. – ErikE Mar 7 '10 at 1:48 @Emtucifor: SQL Server 2005 has a new column data type, XML. See technet.microsoft.com/en-us/library/ms190936%28SQL.90%29.aspx. Sorry, I thought you mentioned SQL Server 2005 because you knew about XML columns. – John Saunders Mar 7 '10 at 2:06 I do know about XML columns, but I don't think actually storing the data as XML is the right route here.
Were you suggesting changing the database structure to use XML columns? Or were you instead suggesting the web application insert to an XML column and then do its updates from there? Or am I missing that an input parameter can be XML data type?
– ErikE Mar 8 '10 at 18:51 Any thoughts on my last comment? – ErikE Mar 31 '10 at 18:09 @Emtucifor: I had been suggesting XML columns. But since you're stuck on SQL Server 2000 for a while, given that you're using Classic ASP, and given what @Remus said about severity 16, I'm not so sure anymore.
– John Saunders Mar 31 '10 at 18:45.