Hi,

I've just been pointed at xmlBlaster. I'm working on a distributed delivery mechanism for large files (>100 MB, probably most around 500 MB, some up to 2 or 3 GB). xmlBlaster seems to offer a lot of what I need: straightforward wrapping of content and metadata, and infrastructure for both broadcast and point-to-point communication. However, two questions:

- Is xmlBlaster robust and efficient for messages of this size?

- I see that you have a mechanism for making messages persist in the face of the system crashing, or presumably one end of a connection going down. So far so good. However, for messages of this size I would want to be able to resume transfer from the point at which contact broke off: if I've transferred 90% of a 1 GB message and the receiver gets pulled from the network, then when it reconnects I would only want to download the remaining 10%. Have you any thoughts on this?

Thanks -- looks interesting!
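To make the resume behaviour I'm after concrete, here is a minimal sketch of what I have in mind: the large payload is split into sequence-numbered chunks, the receiver tracks which chunks have arrived, and after a dropped connection the sender resumes from the first missing chunk instead of starting over. This is purely illustrative pseudocode of the idea, not xmlBlaster's actual API; all names (`Receiver`, `transfer`, etc.) are made up.

```python
CHUNK_SIZE = 4  # tiny for the demo; in practice something like 1 MiB


def split_into_chunks(data, chunk_size=CHUNK_SIZE):
    """Split a payload into (sequence_number, bytes) chunks."""
    return [(i // chunk_size, data[i:i + chunk_size])
            for i in range(0, len(data), chunk_size)]


class Receiver:
    """Tracks which chunks have arrived so a sender can resume."""

    def __init__(self, total_chunks):
        self.chunks = {}
        self.total_chunks = total_chunks

    def receive(self, seq, payload):
        self.chunks[seq] = payload

    def next_missing(self):
        """First sequence number not yet received, or None if complete."""
        for seq in range(self.total_chunks):
            if seq not in self.chunks:
                return seq
        return None

    def assemble(self):
        """Reassemble the payload once every chunk has arrived."""
        return b"".join(self.chunks[s] for s in range(self.total_chunks))


def transfer(chunks, receiver, fail_after=None):
    """Send chunks starting from the receiver's first missing one.

    `fail_after` simulates the connection dropping after that many sends.
    Returns True when the transfer completed, False if it was cut off.
    """
    chunk_map = dict(chunks)
    sent = 0
    while (seq := receiver.next_missing()) is not None:
        if fail_after is not None and sent >= fail_after:
            return False  # connection lost mid-transfer
        receiver.receive(seq, chunk_map[seq])
        sent += 1
    return True
```

The point is that after a failure only the missing 10% (here, the remaining chunks) crosses the wire again; the already-delivered chunks persist on the receiver's side.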