Transferring multiple images over NSInputStream/NSOutputStream
I have an NSInputStream and an NSOutputStream connected to each other over the network. I want to transfer Core Data objects and their associated images from one device to the other. I have successfully converted the Core Data objects to JSON, transferred them to the other end of the stream, and populated Core Data from the JSON there. Now there are images associated with each record. The images live on disk; only their paths are stored in the Core Data objects, and when writing to the stream you must have the complete data in hand. I have the XML (containing the JSON) ready.

1. But how do I transfer the images (NSData *) along with the XML (also NSData *)? On the reading end (NSInputStream), how will I distinguish the XML from the images?

2. Also, I have to transfer multiple images. How do we tell the NSInputStream side that one image's bytes have finished and the next image's bytes have begun?

3. How do we know which image (by name) has been transferred?
Thanks
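One common way to answer all three questions at once is to frame each piece of data before writing it to the stream: a length-prefixed name, then a length-prefixed payload. The reader always knows where the XML stops, where each image begins and ends, and what each item is called. A minimal sketch in C under that assumption (the `frame_encode`/`put_u32`/`get_u32` helper names are illustrative, not part of any Apple API):

```c
#include <stdint.h>
#include <stdlib.h>
#include <string.h>

/* Frame layout, lengths big-endian:
   [4-byte name length][name bytes][4-byte payload length][payload bytes] */

static void put_u32(uint8_t *p, uint32_t v) {
    p[0] = (uint8_t)(v >> 24); p[1] = (uint8_t)(v >> 16);
    p[2] = (uint8_t)(v >> 8);  p[3] = (uint8_t)v;
}

static uint32_t get_u32(const uint8_t *p) {
    return ((uint32_t)p[0] << 24) | ((uint32_t)p[1] << 16) |
           ((uint32_t)p[2] << 8)  | (uint32_t)p[3];
}

/* Encode one named payload into a malloc'd frame; caller frees. */
uint8_t *frame_encode(const char *name, const uint8_t *data, uint32_t dlen,
                      size_t *out_len) {
    uint32_t nlen = (uint32_t)strlen(name);
    *out_len = 4 + nlen + 4 + dlen;
    uint8_t *buf = malloc(*out_len);
    put_u32(buf, nlen);                 /* name length   */
    memcpy(buf + 4, name, nlen);        /* name bytes    */
    put_u32(buf + 4 + nlen, dlen);      /* payload length */
    memcpy(buf + 8 + nlen, data, dlen); /* payload bytes */
    return buf;
}
```

On the wire you would send the XML as one frame (named, say, "records.xml") and each image as its own frame named after its file; the reader parses the two length fields to know exactly how many bytes belong to each item.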
I solved it using the following steps:

1. Convert each managed object to an NSDictionary
2. Put all the dictionaries into an NSArray
3. Convert the NSArray to NSData using NSKeyedArchiver
4. Transfer the NSData over the stream

On the receiving end I reverse the steps above.

Thanks, Marius Kurgonas
Convert the NSData (of each UIImage) to an NSString representation, then put all the NSString objects into an NSDictionary under known keys and serialize that dictionary. When the data arrives on the other side, you reverse the process to extract the images, and the keys tell you which image is which. That way you should be able to transfer multiple images.

Hope this helps.

Cheers
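The "NSData to NSString" step this answer describes is usually done with Base64 (on iOS 7+, `-[NSData base64EncodedStringWithOptions:]` does it directly). For reference, a minimal sketch of the encoding itself in C, so it is clear what the string representation actually contains (the `base64_encode` helper is illustrative):

```c
#include <stdint.h>
#include <stdlib.h>

static const char B64[] =
    "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";

/* Base64-encode len bytes of src into a malloc'd NUL-terminated string. */
char *base64_encode(const uint8_t *src, size_t len) {
    size_t olen = 4 * ((len + 2) / 3); /* 4 output chars per 3 input bytes */
    char *out = malloc(olen + 1);
    size_t i, j;
    for (i = 0, j = 0; i + 2 < len; i += 3) {
        uint32_t n = ((uint32_t)src[i] << 16) |
                     ((uint32_t)src[i + 1] << 8) | src[i + 2];
        out[j++] = B64[(n >> 18) & 63];
        out[j++] = B64[(n >> 12) & 63];
        out[j++] = B64[(n >> 6) & 63];
        out[j++] = B64[n & 63];
    }
    if (len - i == 1) {                 /* one trailing byte: pad "==" */
        uint32_t n = (uint32_t)src[i] << 16;
        out[j++] = B64[(n >> 18) & 63];
        out[j++] = B64[(n >> 12) & 63];
        out[j++] = '=';
        out[j++] = '=';
    } else if (len - i == 2) {          /* two trailing bytes: pad "=" */
        uint32_t n = ((uint32_t)src[i] << 16) | ((uint32_t)src[i + 1] << 8);
        out[j++] = B64[(n >> 18) & 63];
        out[j++] = B64[(n >> 12) & 63];
        out[j++] = B64[(n >> 6) & 63];
        out[j++] = '=';
    }
    out[j] = '\0';
    return out;
}
```

Note the trade-off: Base64 inflates every image by about 33%, so for large images the framing approach (raw bytes plus a length prefix) is cheaper than stringifying everything into one dictionary.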
Every answer here sounds ridiculous. Try something like this:
case NSStreamEventHasBytesAvailable: {
    NSLog(@"NSStreamEventHasBytesAvailable");
    uint8_t mbuf[DATA_LENGTH]; // a buffer of bytes, not of pointers
    mlen = [(NSInputStream *)stream read:mbuf maxLength:DATA_LENGTH];
    NSLog(@"mlen == %lu", mlen);
    [mdata appendBytes:(const void *)mbuf length:mlen];
    NSLog(@"mdata length == %lu", mdata.length);
    // A short read is treated as the end of one image
    if (mlen < DATA_LENGTH) {
        NSLog(@"displayImage");
        UIImage *image = [UIImage imageWithData:mdata];
        [self.peerConnectionViewController.view.subviews[0].layer setContents:(__bridge id)image.CGImage];
        mlen = DATA_LENGTH;
        mdata = [[NSMutableData alloc] init]; // reset for the next image
    }
} break;
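A caveat on the `mlen < DATA_LENGTH` check above: it is only a heuristic. A TCP stream may legally deliver a short read in the middle of an image, which would make the receiver display a truncated picture and mis-split the next one. A more robust reading side buffers whatever chunks arrive and only emits an image once the byte count announced in a length prefix is fully present. A sketch of that accumulator in C, assuming a `[4-byte big-endian length][payload]` wire format (the `Accumulator` type and function names are illustrative):

```c
#include <stdint.h>
#include <stdlib.h>
#include <string.h>

/* Collects arbitrarily sized chunks and reports when a complete
   length-prefixed message is buffered. */
typedef struct {
    uint8_t *buf;
    size_t   len;
} Accumulator;

void acc_feed(Accumulator *a, const uint8_t *chunk, size_t n) {
    a->buf = realloc(a->buf, a->len + n);
    memcpy(a->buf + a->len, chunk, n);
    a->len += n;
}

/* Returns 1 and hands out a malloc'd payload if a full message is
   buffered; returns 0 if more bytes are still needed. */
int acc_take(Accumulator *a, uint8_t **payload, size_t *plen) {
    if (a->len < 4) return 0;
    uint32_t need = ((uint32_t)a->buf[0] << 24) | ((uint32_t)a->buf[1] << 16) |
                    ((uint32_t)a->buf[2] << 8)  | (uint32_t)a->buf[3];
    if (a->len < 4 + (size_t)need) return 0;
    *payload = malloc(need);
    memcpy(*payload, a->buf + 4, need);
    *plen = need;
    /* Keep any bytes belonging to the next message */
    size_t rest = a->len - 4 - need;
    memmove(a->buf, a->buf + 4 + need, rest);
    a->len = rest;
    return 1;
}
```

In the NSStreamEventHasBytesAvailable handler you would call the equivalent of `acc_feed` with every read and only build a UIImage when `acc_take` succeeds, instead of relying on a short read.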
...
- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    // Convert the captured pixel buffer into a UIImage
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef newImage = CGBitmapContextCreateImage(newContext);
    CGContextRelease(newContext);
    CGColorSpaceRelease(colorSpace);
    UIImage *image = [[UIImage alloc] initWithCGImage:newImage scale:1 orientation:UIImageOrientationUp];
    CGImageRelease(newImage);
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    NSData *data = [NSData dataWithData:UIImageJPEGRepresentation(image, 0.25)];

    __block BOOL baseCaseCondition = NO; // obviously this should be data driven, not hardcoded
    __block NSInteger _len = DATA_LENGTH;
    __block NSInteger _byteIndex = 0;

    typedef void (^RecursiveBlock)(void (^)());
    RecursiveBlock aRecursiveBlock;

    // Write the JPEG data in chunks of at most DATA_LENGTH bytes
    aRecursiveBlock = ^(RecursiveBlock block) {
        NSLog(@"Block called...");
        baseCaseCondition = (data.length > 0 && _byteIndex < data.length) ? YES : NO;
        if (baseCaseCondition && block)
        {
            _len = (data.length - _byteIndex) == 0 ? 1 : (data.length - _byteIndex) < DATA_LENGTH ? (data.length - _byteIndex) : DATA_LENGTH;
            NSLog(@"START | byteIndex: %lu/%lu writing len: %lu", _byteIndex, data.length, _len);
            uint8_t bytes[_len]; // a buffer of bytes, not of pointers
            [data getBytes:bytes range:NSMakeRange(_byteIndex, _len)];
            // Advance by however many bytes the stream actually accepted
            _byteIndex += [self.outputStream write:(const uint8_t *)bytes maxLength:_len];
            NSLog(@"END | byteIndex: %lu/%lu wrote len: %lu", _byteIndex, data.length, _len);
            // Schedule the next chunk instead of recursing on the stack
            dispatch_barrier_async(dispatch_get_main_queue(), ^{
                block(block);
            });
        }
    };

    if (self.outputStream.hasSpaceAvailable)
        aRecursiveBlock(aRecursiveBlock);
}
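The recursive block above boils down to a chunked-write loop: offer at most DATA_LENGTH bytes, advance by however many bytes the stream actually accepted, repeat until everything is sent. The same control flow reduced to a plain loop in C, with the stream write abstracted behind a function pointer so it can be exercised without a real socket (all names here are illustrative):

```c
#include <stddef.h>
#include <stdint.h>
#include <string.h>

#define DATA_LENGTH 1024

/* Mimics -[NSOutputStream write:maxLength:]: may accept fewer bytes
   than offered; returns the count actually written, or <= 0 on error. */
typedef long (*write_fn)(void *ctx, const uint8_t *buf, size_t maxLen);

/* Sends the whole buffer in chunks of at most DATA_LENGTH bytes,
   advancing by the amount actually accepted. Returns 0 on success. */
int send_all(write_fn write, void *ctx, const uint8_t *data, size_t len) {
    size_t sent = 0;
    while (sent < len) {
        size_t want = (len - sent < DATA_LENGTH) ? len - sent : DATA_LENGTH;
        long n = write(ctx, data + sent, want);
        if (n <= 0) return -1; /* stream error or closed */
        sent += (size_t)n;
    }
    return 0;
}

/* Example sink accepting at most 7 bytes per call, standing in for a
   stream that takes fewer bytes than offered. */
static uint8_t sink[64];
static size_t sink_len = 0;
static long mock_write(void *ctx, const uint8_t *buf, size_t maxLen) {
    (void)ctx;
    size_t n = maxLen < 7 ? maxLen : 7;
    memcpy(sink + sink_len, buf, n);
    sink_len += n;
    return (long)n;
}
```

The key detail shared with the Objective-C version is using the return value of the write call, not the requested length, to advance the byte index; a partial write otherwise silently corrupts the transfer.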
Thank you, dear. Your answer led me to the solution. –