Hi,
I have written a C# console application that adds a message to a SB queue and a C# stored procedure that reads the message from the queue.
I have had so many problems that I'm beginning to doubt the usefulness of this, but that is probably just my frustration speaking.
I am stuck on putting an XML message into the queue and reading the XML in the stored procedure.
The console app has a simple object called Message. This class has 2 fields, a guid and a string. I serialize the object into XML using XmlSerializer. This results in an XML string that looks like this:
<?xml version="1.0" encoding="utf-16"?>
<Message xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
<TranId>a5b4a32f-4940-46ea-a928-3aae33a067fe</TranId>
<Text>TEST</Text>
</Message>
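For reference, a minimal sketch of the class and serialization described above (the field names come from the XML; the ToXml helper is an assumption based on the SEND code below):
using System;
using System.IO;
using System.Xml.Serialization;

public class Message
{
    public Guid TranId;
    public string Text;

    // Assumed helper used by the SEND code below; serializing through a StringWriter
    // is what produces the encoding="utf-16" declaration shown above.
    public string ToXml()
    {
        XmlSerializer serializer = new XmlSerializer(typeof(Message));
        using (StringWriter writer = new StringWriter())
        {
            serializer.Serialize(writer, this);
            return writer.ToString();
        }
    }
}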
I use a SqlCommand to SEND the message to the queue. I use the following code to SEND the message:
// Add dialog handle
SqlParameter paramDialogHandle = new SqlParameter("@dh", SqlDbType.UniqueIdentifier);
paramDialogHandle.Value = this.DialogHandle;
cmd.Parameters.Add(paramDialogHandle);
// Add message
SqlParameter paramMsg = new SqlParameter("@msg", SqlDbType.VarChar, 4000);
paramMsg.Value = msg.ToXml();
cmd.Parameters.Add(paramMsg);
// Build the SEND command
cmd.CommandText = "SEND ON CONVERSATION @.dh " +
"MESSAGE TYPE [http://www.TestSB.com/msg/HelloWorld] " +
"(@.msg)";
cmd.ExecuteNonQuery();
For some reason, this message never gets to the stored procedure that is set up to be activated for this queue. Actually, it never even gets written to the queue.
However, if I change the type of the parameter in the SEND above to SqlDbType.Xml, the message is written into the queue, but without the '<?xml version="1.0" encoding="utf-16"?>' declaration at the beginning, so when my stored procedure tries to deserialize the XML using XmlSerializer, it fails.
Obviously, I have overcome the backflips needed to do XmlSerialization within the SQLCLR, and I am getting the stored procedure activated, because I do get a message logged showing the XML that is received.
Since I have the queue set up to only allow valid XML messages, I'm assuming that inclusion of the <?xml version ... ?> line prevents the message from passing Service Broker's XML validation. (And by the way, I have never seen an error message placed into the queue, the event log, or thrown when non-XML is passed to a queue that has XML validation turned on. The message just does not show up in the queue. Is this a bug?) But the lack of the line prevents XmlSerializer from working.
How the heck do you serialize an object into XML using XmlSerializer, so that it can be deserialized with XmlSerializer?
Thanks
]Monty[
On the Service Broker specific issues I advise you to try following the steps in the troubleshooting article I posted at http://blogs.msdn.com/remusrusanu/archive/2005/12/20/506221.aspx. What you will most likely find is that when it appears that you send the message and the message vanishes, you will actually find an error message in the sender's service queue complaining that the message is not valid XML.
You see, the XML validation of messages is always performed by the target service when the message is received, not by the sender when the message is sent. Therefore, if you send an incorrect XML message, it will not trigger an error in the SEND statement. However, since I assume your message type is defined as WELL_FORMED_XML, the message will not pass the XML validation when Service Broker attempts to enqueue it in the target queue, and an error message will be sent back. This is the error message you'll find in your sender's queue.
The XML formatting problems you are seeing are due to the parameter types you use. It makes a big difference to the XML whether you pass it in as a VARCHAR(4000), an NVARCHAR(4000) or an XML datatype. VARCHAR types carry UTF-8 encoded XML, so your serializer needs to create UTF-8 XML. The NVARCHAR type carries UTF-16 encoded XML, and you need to serialize it accordingly. I'm not a .Net XmlSerializer expert, but I believe you can specify the encoding to be used. I know for one that you can create an XmlTextWriter and specify Encoding.UTF8 or Encoding.Unicode (UTF-16), then use this XmlTextWriter to obtain the xml stream from the XmlSerializer. Similarly, you can specify the encoding in the XmlTextReader when deserializing the XML.
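For example, a rough sketch of serializing to UTF-8 before passing the value as a VARCHAR parameter (this assumes the Message object from earlier; UTF8Encoding(false) is used so no byte-order mark precedes the XML declaration):
using System.IO;
using System.Text;
using System.Xml;
using System.Xml.Serialization;

// 'msg' is the Message instance to send.
XmlSerializer serializer = new XmlSerializer(typeof(Message));
string utf8Xml;
using (MemoryStream stream = new MemoryStream())
{
    // false = no byte-order mark, so the string starts with the XML declaration.
    using (XmlTextWriter writer = new XmlTextWriter(stream, new UTF8Encoding(false)))
    {
        serializer.Serialize(writer, msg);
    }
    utf8Xml = Encoding.UTF8.GetString(stream.ToArray());
}
// utf8Xml now begins with <?xml version="1.0" encoding="utf-8"?> and can be used
// as the value of the VARCHAR parameter.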
However, if you use the XML datatype, then you shouldn't need to specify the encoding explicitly. It seems you got this working on the SEND side, but you cannot get the message body as XML on the receive side. How do you obtain the message in the C# code on the receiver side? Are you using SqlCommand.ExecuteReader to parse the resultset of a RECEIVE statement? What you need to do is make sure that you get back from the server an XML datatype, not a VARBINARY(MAX). Simply use RECEIVE ... CAST(message_body AS XML) for this to happen.
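On the receive side, a rough sketch of what that could look like from ADO.NET (assumptions: an open connection and transaction around the RECEIVE, the Message class from earlier, and the [TestSB Queue] name from the original post):
using System.Data.SqlClient;
using System.Data.SqlTypes;
using System.Xml.Serialization;

SqlCommand cmd = conn.CreateCommand();
cmd.Transaction = tran;
// Casting message_body to XML makes the client see an XML value instead of varbinary(max).
cmd.CommandText =
    "WAITFOR (RECEIVE TOP(1) CAST(message_body AS XML) AS body " +
    "FROM [TestSB Queue]), TIMEOUT 5000";
using (SqlDataReader reader = cmd.ExecuteReader())
{
    if (reader.Read())
    {
        SqlXml body = reader.GetSqlXml(0);
        XmlSerializer serializer = new XmlSerializer(typeof(Message));
        Message received = (Message)serializer.Deserialize(body.CreateReader());
    }
}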
HTH,
~ Remus
|||
Hi Remus,
Thanks for the reply. Yes, what you've posted about the data types was part of the problem. However, I have never been able to find any trace of a poorly formed XML error message, in the target's queue or in the initiator's queue. If you say so, I guess that's right, but it doesn't work for me.
I've gotten around the serialization problem by using nvarchar(max) on the send and receive side, and the <?xml ... ?> declaration is making it across fine now. However, now XmlSerializer is complaining about the first line of the XML that follows it. I've given up on this and am in the process of simplifying things back down to where they were once somewhat working predictably.
My problem now is that I can only run my test once and get an error message; then I must shut down Visual Studio 2005 (because it is holding a connection open somewhere), drop and re-create the database in SQL Management Studio, and then bring up VS2005 again and run my console application. If I run it 2 times in a row, the stored procedure stops getting activated, even though in my postdeployscript.sql I am dropping and then recreating the activation of the newly installed stored proc using the following:
ALTER QUEUE [TestSB Queue] WITH ACTIVATION (DROP);
go
alter queue [TestSB Queue]
with status=on,
activation (
status = on,
procedure_name = ServiceProc,
max_queue_readers = 2,
execute as self);
That brings up another question: Why is it you must drop the activation and then create it again in order to get the activation to fire on the newly deployed stored procedure? I understand that the assembly containing the stored proc has a different version, but it took me forever to figure out that you had to do a drop first, and that the alter queue [xx] Activation(....) wasn't enough.
Anyway, I have been hitting my head against the wall with this all week. Tomorrow I am going to start over with a new solution and try to simplify things even more to get to the bottom of this. If you forced me to go into production right now with Service Broker, I'd have to complain that it isn't ready. Or I'm not ready, at best. I've had so many flakey experiences so far that I know my confidence has been eroded. There is too little documentation and apparently not a lot of experience with C# programs written under Visual Studio that write to queues serviced by SQLCLR stored procedures. There is very little in the way of samples, etc. Heck, the words 'postdeployscript' and 'predeployscript', necessary things for doing SB stuff within VS, can't even be found in the Visual Studio help!
In the end, all I want to do is serialize a message through SB to some code that makes a web service call. Boy is that a tall order.
]Monty[
Are you ending the sending dialog after the SEND? If that's the case, then when the error comes back it will find the dialog already ended and therefore it will simply delete the dialog, as well as the error message. If you do something like BEGIN DIALOG/SEND/END, this is similar to a 'fire and forget' scenario, where you (the sender) can never know if your message actually reached the target.
Anyway, using the Profiler can reveal a great deal of what's going on with the dialog messages, select the events in the Broker category.
Unfortunately I cannot help you too much with the CLR and Visual Studio deployment problems, since I'm unfamiliar with them myself. The dedicated .Net forum is monitored by experts with more knowledge in that area, http://forums.microsoft.com/MSDN/ShowForum.aspx?ForumID=86&SiteID=1, so it would be worth the trouble to describe the problem there. There are also some blogs of experts in this area, like http://blogs.msdn.com/sqlclr/
One explanation for the weird behavior you see would be that your activated procedure never returns. You can verify this in the Profiler (the activation started/stopped events) as well as by querying the sys.dm_broker_activated_tasks view.
HTH,
~ Remus
1. You do not have to ALTER QUEUE in order to use the newly installed stored proc. If there are any instances of the stored proc already running, they will continue to run the old version. However, the next time Service Broker activates a new proc it should automatically launch the newly deployed version. This behavior is exactly the same as ALTER PROCEDURE in SQL, which can update a procedure without killing spids that are already running the older version; any new invocation will use the new code.
2. You should be able to close the connection to the database from Visual Studio by opening the Server Explorer pane, navigating to the database, right clicking and selecting 'Close Connection'.
3. You do not have to DROP activation before reconfiguring it. You can adjust any/all of the settings of activation (viz, status, procedure_name, max_queue_readers and execute as user) using a single ALTER QUEUE statement.
4. Here's a resource on XmlSerialization in SQLCLR:
http://blogs.msdn.com/sqlclr/archive/2005/07/25/Vineet.aspx
The steps I perform are as follows:
SqlCommand cmd = conn.CreateCommand();
cmd.CommandText = @"WAITFOR (RECEIVE TOP(1) message_body, ... FROM [YourQueue]), TIMEOUT 5000";
cmd.Transaction = tran;
SqlDataReader reader = cmd.ExecuteReader();
if (reader.Read())
{
    // SqlBytes is in System.Data.SqlTypes; 0 is the ordinal of the message_body column.
    SqlBytes body = reader.GetSqlBytes(0);
    XmlSerializer xs = new XmlSerializer(typeof(YourObject));
    YourObject yo = (YourObject)xs.Deserialize(body.Stream);
    //...
}
5. If you install the samples that come with the SQL Server DVDs, you should find a HelloWorld_CLR sample as well as a ServiceBrokerInterface library which is a sample programming model for writing service broker application. The library also has a shopping cart sample.
|||I just wanted to caution you about using SQLCLR to make outbound HTTP requests. When you make a web request, note that you are doing so as the user running sqlservr.exe. If this is a machine account, you may want to think about the consequences of that principal making outbound HTTP connections to proxy servers (or worse, to the destination web server itself).
Secondly, if you make a blocking web request call, you will be holding the transaction locks on the conversation group (and anything else that the stored proc may have locked) until the response is received and parsed.
Thirdly, you are using up one of sqlservr's threads for doing the web request, preventing the server from doing real database work until the request returns.
Finally, how are you going to handle errors from the web server? Either you will have to roll back the transaction (but then the message will pop up again on the next RECEIVE), or use some sophisticated mechanism of logging pending requests and retrying them using some retry logic. If you choose to roll back, remember that rolling back 5 times consecutively will disable the queue, to prevent the poison message from repeatedly activating the queue.
If any of these issues concern you then you might want to think about re-architecting the solution to use an external ADO.NET app that receives the messages rather than an internally activated stored proc. If you want the ADO.NET app to be dynamically launched you could try the external activator mentioned here (http://rushi.desai.name/Blog/tabid/54/EntryID/7/Default.aspx).
|||Hi guys,
Thanks for the replies. This is really going somewhere now :)
1) Getting errors when sending non-xml thru a 'well formed xml' queue:
Yes, I am ending the dialog after the send. Are you saying that I need to turn around and wait for a response error message when the message is read? If so, how do I keep the transaction from becoming a super long one? That's one of the points I'm trying to make. This is a fire-and-forget queue; I'm basically trying to do a one-way message. I think the 'well-formed XML' validation is not too useful in this case. I can't wait around for the reading of the message to generate a returning error message, which might be an hour or so from now, and the transaction would have to be open all that time. I think a better approach is to turn off the validation and deal with it in the application code.
2) Doing an ALTER QUEUE after a new stored procedure is installed.
Interesting theory. Unfortunately, I have *NEVER* been able to get the stored procedure to fire after a deploy from visual studio. I spent 2 days on this until I tried doing a drop and create on the activation and finally got the stored procedure to be called. I'm obviously doing something wrong, but I can't for the life of me figure out what it is. And it only works for me if I do the drop and then the create. The create is not enough for me.
3) Making outbound HTTP requests from SQLCLR
Well, reliable messaging is the whole point of my exercise. There is no Indigo yet, and QueuedComponents/MSMQ is not officially supported by Microsoft in a clustered environment, and a clustered environment is required for fault tolerance in production since there is no disaster recovery for MSMQ. Believe me, I have been there. QueuedComponents/MSMQ is not an option. So I'm left with SB queues with stored procedures making web service calls. I understand the transactional concerns that have been raised, and handling errors is a big unsolved problem because of the poison message 'feature' (can it be turned off?), the fact that there is no way to delay a message from being received again, and the fact that the stored procedure is supposed to read all the messages before it stops. Assuming the web server is down, if I requeue the message back to the queue again, the stored procedure will just keep reading the same message over and over. I really don't know how to handle errors when the web service call fails; I haven't been able to get that far yet. Are you saying that internal activation is not the way to do this, and external activation is the way to go? Does it help with handling errors, or does it just avoid the resource usage problem of HTTP requests inside of SQLCLR?
4) Closing connections to the database by closing the connections in server explorer.
I do this, but Visual Studio is still holding a connection, because I can't drop the database unless I close VS. Even closing the solution doesn't work. If I didn't have the activation problem, I wouldn't have to go through this step, so solving the activation problem would lessen this problem a bunch.
Thanks
]Monty[
EDIT: I was just checking out the external activation samples, but they use that ServiceBrokerInterface code. So the example only really shows how to use it, rather than how to do external activation. Are there any simplified examples available that don't use the interface code? I'm sorry, but that code is not commented very well and is very complicated. I guess if that is all there is available then I can try to work it out, but I started my current exercise using it, eventually gutted it down to the basic parts trying to understand it, and I've obviously messed it up.
Hi again,
On the subject of the external activation sample:
Imagine that the environment is a classic 3-tier architecture: a UI or Web Service layer, an application layer that the UI talks to through web services (the UI and the app layer are web farms, not clustered using the clustering service but load balanced with NLB), and finally the database server layer. How is a Windows Service going to run safely in this environment? I would need a clustered Windows Service layer in order to provide fault tolerance for the Windows service, and it doesn't exist. Do I run a copy of the service on each web service machine in the app layer? How do I get them all to coordinate? Does the sample provided have this support?
]Monty[
Monty Hi,
I'm really glad you're making progress. I'm sure we'll nail this down in no time.
But first of all, you are making some wrong assumptions about how Service Broker works and how to write Service Broker apps. Service Broker messages will not be sent until a transaction commits. So there will definitely be no long transaction left open waiting for the error, since the error by definition won't come until you commit your transaction :). The model is like this: you send the message and commit. Then the application goes about its normal business. Later (as you pointed out, it may be hours later), either an error comes back (malformed XML), or the dialog times out (this is just another error message in your queue), or a response comes back. This message (errors are also messages) should be the trigger to continue your processing of this request (continue from where the SEND left off). You can either use activation on the sender's side, or the application can have a dedicated listener thread that sits in a WAITFOR(RECEIVE...) loop. In the latter case, the application should, of course, be prepared to deal with the case where it just started up and finds response messages for requests it made a week ago, the last time it was running.
BTW, in case you noticed that Service Broker messages appear to be sent immediately, before the transaction is committed, this is just an optimization we do in the case when the target service resides in the same instance as the sender. The 'sent' messages are actually locked by the sender's transaction and won't be available for RECEIVE in any other transaction until the sender commits. And if the optimization cannot be performed for whatever reason (target database offline, target queue disabled, target conversation locked etc.), the message will take the normal path (through the sender's sys.transmission_queue).
Now about the WebService calls.
The HTTP requests per se are very expensive to make from inside SQL Server, so it is a matter of resource consumption to do them externally rather than internally.
On how to handle a message like this (one that requires a potentially failing, expensive operation such as a WS call), my proposed handling is this:
begin transaction
    receive message
    if message is a web request
        save state of request (http address, caller etc)
    else if message is a retry timer message
        load state of request
    endif
    set a retry timer on the dialog (say 1 minute) using BEGIN CONVERSATION TIMER
commit
do the web request (no transaction open)
if success
    begin transaction
        reset the retry timer
        send back response
        end conversation
    commit
endif
This way you don't hold a long transaction open (nothing is worse for a database than those!) and you have a persistent retry timer, stored in the database. You will retry your web request even after a server restart or a failover (cluster or mirroring), because dialog timers are persisted. And you don't roll back in case of web call failure.
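As a rough illustration of just the retry-timer step (an assumption-laden sketch: conn, tran and conversationHandle are presumed to come from the surrounding RECEIVE logic):
using System.Data;
using System.Data.SqlClient;

// Arm a one-minute timer on the open conversation before committing.
SqlCommand timerCmd = conn.CreateCommand();
timerCmd.Transaction = tran;
timerCmd.CommandText = "BEGIN CONVERSATION TIMER (@handle) TIMEOUT = 60";
timerCmd.Parameters.Add("@handle", SqlDbType.UniqueIdentifier).Value = conversationHandle;
timerCmd.ExecuteNonQuery();
// When the timer fires, Service Broker places a message of type
// http://schemas.microsoft.com/SQL/ServiceBroker/DialogTimer on the queue for this
// conversation, which re-activates the procedure so the web request can be retried.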
HTH,
~ Remus
|||
Remus Rusanu wrote: But first of all, you are making some wrong assumptions about how Service Broker works and how to write Service Broker apps. Service Broker messages will not be sent until a transaction commits. So there will definitely be no long transaction left open waiting for the error, since the error by definition won't come until you commit your transaction :). The model is like this: you send the message and commit. Then the application goes about its normal business. Later (as you pointed out, it may be hours later), either an error comes back (malformed XML), or the dialog times out (this is just another error message in your queue), or a response comes back. This message (errors are also messages) should be the trigger to continue your processing of this request (continue from where the SEND left off). You can either use activation on the sender's side, or the application can have a dedicated listener thread that sits in a WAITFOR(RECEIVE...) loop. In the latter case, the application should, of course, be prepared to deal with the case where it just started up and finds response messages for requests it made a week ago, the last time it was running.
Hi Remus
You hit the nail on the head, I am completely confused on how this works. I thought that was pretty obvious from my previous posts. :)
So to summarize: when I send a message, I do the SEND and commit the transaction, but I don't END CONVERSATION until I get back a response or an error. Right now, 1) I'm not sending back a response on the receive side, and 2) I'm ending the conversation as soon as the send completes. Also, since my sender is a console application, I should turn around and wait for the response/errors before sending another message, correct?
TIA
]Monty[
|||
Hi again,
Another question?
After sending the message (and not ending the conversation), do I receive on the queue I did the send on or the client queue that I had to set up to create the initiator service?
Here's the ddl for the queues, messages, and contracts:
PRINT 'create message SBTest.Message1';
CREATE MESSAGE TYPE
[SBTest.Message1]
validation = NONE;
PRINT 'create contract SBTest.Contract';
CREATE CONTRACT [SBTest.Contract]
(
[SBTest.Message1] SENT BY INITIATOR
);
go
--*********************************************
--* Create the [SBTest] service
--*********************************************
PRINT 'create queue ''[SBTest Queue]''';
CREATE QUEUE [SBTest Queue];
PRINT 'create service ''[SBTest]''';
CREATE SERVICE [SBTest]
ON QUEUE [SBTest Queue]
(
[SBTest.Contract]
);
go
--*********************************************
--* Create the [SBTest Client] service (the client.exe program)
--*********************************************
PRINT 'create queue ''[SBTest Client Queue]''';
CREATE QUEUE [SBTest Client Queue];
PRINT 'create service ''[SBTest Client]''';
CREATE SERVICE [SBTest Client]
on queue [SBTest Client Queue];
-- no contract because it only initiates messages
go
Do I do a receive on [SBTest Queue] or [SBTest Client Queue]?
TIA
]Monty[
|||Response messages will be sent by the target service to the initiator service. Hence they will be delivered to the initiator queue and you should receive them from there (i.e. [SBTest Client Queue]) in your case. If you want the response message to actually contain a real message body, you will need to add a new message type and alter the contract. If your responses do not contain any info but simply acknowledge the initiator that its request has been handled, you could simply end the conversation on the target which will send the special 'End Conversation' message back to the initiator as follows:
Initiator:
    Begin Tran
    Begin Dialog
    Send a request
    Commit

Target:
    Begin Tran
    Receive
    Process message
    End conversation
    Commit

Initiator:
    Begin Tran
    Receive
    If 'End Dialog' message
        End conversation
    Else If 'Error' message
        Log/MsgBox/Email
        End conversation
    Commit
As Remus explained, handling the responses could be performed in a background thread of your console application. If you need the user to be alerted of errored dialogs immediately, this background thread could signal the UI thread to pop up a message box. Or it could simply log the failures to a table so that the user can look at the table later to see what failed. If you don't need the responses to be processed immediately, you could set up a periodic task (you can do that using SSB itself, without requiring SQL Agent) that processes the responses.
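A rough sketch of that initiator-side handling in C# (assumptions: an open connection and transaction, the [SBTest Client Queue] name from the DDL above, and the built-in Service Broker EndDialog and Error message type names):
using System;
using System.Data;
using System.Data.SqlClient;

const string EndDialogType = "http://schemas.microsoft.com/SQL/ServiceBroker/EndDialog";
const string ErrorType = "http://schemas.microsoft.com/SQL/ServiceBroker/Error";

SqlCommand cmd = conn.CreateCommand();
cmd.Transaction = tran;
cmd.CommandText =
    "WAITFOR (RECEIVE TOP(1) conversation_handle, message_type_name, message_body " +
    "FROM [SBTest Client Queue]), TIMEOUT 5000";

Guid handle = Guid.Empty;
string messageType = null;
using (SqlDataReader reader = cmd.ExecuteReader())
{
    if (reader.Read())
    {
        handle = reader.GetGuid(0);
        messageType = reader.GetString(1);
    }
}

if (messageType == ErrorType)
{
    // Log the failure (the body is an XML error document) so it can be inspected later.
}

if (messageType == EndDialogType || messageType == ErrorType)
{
    // End our half of the conversation.
    SqlCommand endCmd = conn.CreateCommand();
    endCmd.Transaction = tran;
    endCmd.CommandText = "END CONVERSATION @handle";
    endCmd.Parameters.Add("@handle", SqlDbType.UniqueIdentifier).Value = handle;
    endCmd.ExecuteNonQuery();
}

tran.Commit();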
|||Monty Shaw writes:
Also, since my sender is a console application, I should turn around and wait for the response/errors, before sending another message, correct?
This depends on whether the user needs to block and wait for the response before submitting the next request. But as you said, the user is not really interested in the response, so you could simply keep accepting new requests on the console and submit them using Begin Dialog/Send in a transaction. Processing of responses can be performed asynchronously in a background task or deferred to a batched task that runs every night.
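Submitting one request per transaction might look roughly like this (a sketch using the service, contract and message type names from the DDL posted earlier; conn is an open SqlConnection and xml is the serialized message string):
using System;
using System.Data;
using System.Data.SqlClient;

SqlTransaction tran = conn.BeginTransaction();
SqlCommand cmd = conn.CreateCommand();
cmd.Transaction = tran;
cmd.CommandText =
    "DECLARE @handle UNIQUEIDENTIFIER; " +
    "BEGIN DIALOG CONVERSATION @handle " +
    "  FROM SERVICE [SBTest Client] " +
    "  TO SERVICE 'SBTest' " +
    "  ON CONTRACT [SBTest.Contract] " +
    "  WITH ENCRYPTION = OFF; " +   // commonly turned off when dialog security is not configured
    "SEND ON CONVERSATION @handle MESSAGE TYPE [SBTest.Message1] (@msg); " +
    "SELECT @handle;";
cmd.Parameters.Add("@msg", SqlDbType.NVarChar, -1).Value = xml;
Guid conversationHandle = (Guid)cmd.ExecuteScalar();
tran.Commit();
// Keep conversationHandle if you want to correlate the eventual EndDialog/Error
// response back to this request.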
Later,
Rushi
Ok guys,
I created a small visual studio solution that does things the way you've suggested. I've zipped up the entire solution. I can not get the stored procedure to activate and consume messages out of the queue.
In order to recreate what I am seeing, you will have to:
1. unzip this zip
2. change the database references in the Database project and the SqlServer project.
3. run the create.sql script to create the database.
4. deploy the solution
5. run the activate.sql script to set up the activation of the stored procedure
6. run the Client.exe (run without debug).
The first time you run this it may work. If so, run it again. What I see after the first run and a solution deploy is that the stored procedure is never activated again to process messages. The stored proc is set up to just read the messages and then end the conversation, along with writing messages into the log table. Nothing, nada.
Help!
Is there a way to upload a zip file? I wrote the above thinking I could attach a file to this post, but there doesn't appear to be a way to do that. If you can tell me how to do that or where to send it, I have a zip file just waiting.
]Monty[
|||You should wait for a response on [SBTest Client Queue].
The [SBTest] service should end the conversation on success. The client will get back an error message on failure (for whatever reason: XML validation, conversation timeout, access denied on the target service) or an end-conversation message in case of success. The client could wait for a response with a WAITFOR(RECEIVE ... FROM [SBTest Client Queue] WHERE conversation_handle = @conversation_handle), where @conversation_handle is the handle returned by BEGIN DIALOG.
BTW, if the initiator (sender) is a console application and there is no message sent back to the initiator, I'm not sure it makes sense for the console application to wait for a response, but I guess this is solely for the purpose of testing and figuring out how things work.
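In ADO.NET that correlated wait might look roughly like this (a sketch; conversationHandle is the value returned by BEGIN DIALOG, and conn/tran are an open connection and transaction):
using System.Data;
using System.Data.SqlClient;

SqlCommand cmd = conn.CreateCommand();
cmd.Transaction = tran;
cmd.CommandText =
    "WAITFOR (RECEIVE TOP(1) message_type_name, message_body " +
    "FROM [SBTest Client Queue] " +
    "WHERE conversation_handle = @handle), TIMEOUT 30000";
cmd.Parameters.Add("@handle", SqlDbType.UniqueIdentifier).Value = conversationHandle;
SqlDataReader reader = cmd.ExecuteReader();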
HTH,
~ Remus