
Playing With the Simple "AVSpeechSynthesizer" APIs

iOS7 provides a way to use speech (a kind of text-to-speech conversion). You create the synthesizer like any other NSObject:

    AVSpeechSynthesizer *speechSynthesizer = [[AVSpeechSynthesizer alloc] init];

To let it speak, we need to pass it an "AVSpeechUtterance" object using this API:

    [speechSynthesizer speakUtterance:/* AVSpeechUtterance object */];

To create an "AVSpeechUtterance" object, you do as below:

    AVSpeechUtterance *utterance = [AVSpeechUtterance speechUtteranceWithString:@"Hey baby!"];

So, the utterance is the one that provides the text to be spoken by the synthesizer. It also has a few interesting properties that can be tweaked based on our needs, like rate, volume, pitchMultiplier, preUtteranceDelay and postUtteranceDelay. For example, to slow down the speech we could use:

    utterance.rate *= 0.5;

whereas preUtteranceDelay and postUtteranceDelay are used to delay the speech before and after the utterance is spoken.
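Putting it all together, here is a minimal sketch (assuming the statements run inside a view controller method such as viewDidLoad; the text and property values are just examples):

    #import <AVFoundation/AVFoundation.h>

    // Create the synthesizer and an utterance for the text to be spoken.
    AVSpeechSynthesizer *speechSynthesizer = [[AVSpeechSynthesizer alloc] init];
    AVSpeechUtterance *utterance = [AVSpeechUtterance speechUtteranceWithString:@"Hey baby!"];

    // Tweak the utterance before speaking.
    utterance.rate *= 0.5;              // slow the speech down
    utterance.pitchMultiplier = 1.2;    // example value: slightly higher pitch
    utterance.preUtteranceDelay = 0.5;  // half-second pause before speaking
    utterance.postUtteranceDelay = 0.5; // half-second pause after speaking

    [speechSynthesizer speakUtterance:utterance];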

Few Other Useful Additions to iOS7

Message UI Framework (Attach Files to Messages): We have always been using "MFMessageComposeViewController" to send messages, but we weren't allowed to add images to them. After iOS7, you can use this API to attach images to messages:

    - (BOOL)addAttachmentData:(NSData *)attachmentData typeIdentifier:(NSString *)uti filename:(NSString *)filename;

like below:

    if ([MFMessageComposeViewController canSendText] &&
        [MFMessageComposeViewController canSendAttachments] &&
        [MFMessageComposeViewController isSupportedAttachmentUTI:(NSString *)kUTTypePNG]) {
        MFMessageComposeViewController *vc = [[MFMessageComposeViewController alloc] init];
        vc.messageComposeDelegate = self;
        vc.recipients = @[@"sender"];
        UIImage *myImage = [UIImage imageNamed:@"imageName.png"];
        BOOL attached = [vc addAttachmentData:UIImagePNGRepresentation(myImage)
                               typeIdentifier:(NSString *)kUTTypePNG
                                     filename:@"imageName.png"];
        [self presentViewController:vc animated:YES completion:nil];
    }
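Since the code above sets "self" as the messageComposeDelegate, the presenting class should adopt MFMessageComposeViewControllerDelegate and dismiss the composer when the user is done. A minimal sketch of that callback (the logging is just illustrative):

    #import <MessageUI/MessageUI.h>

    // MFMessageComposeViewControllerDelegate: called when the user sends or cancels.
    - (void)messageComposeViewController:(MFMessageComposeViewController *)controller
                     didFinishWithResult:(MessageComposeResult)result {
        if (result == MessageComposeResultFailed) {
            NSLog(@"Sending the message failed.");
        }
        [controller dismissViewControllerAnimated:YES completion:nil];
    }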

Scratching the surface of NSURLSession

NSURLSession is the replacement for NSURLConnection, and it resolves the issues that occurred with the latter. Session objects are created with class methods that take a configuration object. There are 3 possible session types:

* Default session
* Ephemeral, in-process session
* Background session

    NSURLSessionConfiguration *sessionConfig = [NSURLSessionConfiguration defaultSessionConfiguration];

The above configuration object has properties that control the way the session behaves, for instance whether cookies are allowed, timeouts, etc. There are also a few more properties like:

* allowsCellularAccess: tells the operating system whether the networking session is permitted to run when only the cellular network is available.
* discretionary: tells the operating system that it may wait to run the networking session until the device is connected to Wi-Fi or has good power.

After creating the session configuration object, we can create the session object.
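For example, a minimal sketch of creating the session from that configuration and running a simple data task (the URL and the timeout value are just placeholders):

    NSURLSessionConfiguration *sessionConfig = [NSURLSessionConfiguration defaultSessionConfiguration];
    sessionConfig.allowsCellularAccess = YES;        // allow the session over cellular
    sessionConfig.timeoutIntervalForRequest = 30.0;  // example request timeout

    // Create the session from the configuration.
    NSURLSession *session = [NSURLSession sessionWithConfiguration:sessionConfig];

    // A simple data task; the completion handler runs when the transfer finishes.
    NSURL *url = [NSURL URLWithString:@"http://example.com"];
    NSURLSessionDataTask *task =
        [session dataTaskWithURL:url
               completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
                   if (error) {
                       NSLog(@"Request failed: %@", error);
                   } else {
                       NSLog(@"Received %lu bytes", (unsigned long)data.length);
                   }
               }];
    [task resume]; // tasks are created suspended, so start it explicitly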

UIKit Dynamics Overview

Yes, now you can have all your UI objects behave the way objects in a physical environment behave. You can add physics concepts like gravity, collisions, springs, etc. to UI objects. Apple updated the UIKit framework (which contains almost all of the iOS UI elements) so that physics behaviors can be added to UIKit elements/objects. Hence, a 2D physics engine lies just underneath UIKit to simulate real-world objects. To use UIKit Dynamics, we apply different behaviors via "UIDynamicBehavior" to objects that conform to the "UIDynamicItem" protocol. UIView already does this for you, so any subclass of UIView just works, and you can even create your own objects conforming to the above protocol. After setting/applying all the physics behaviors to the objects we need, we provide them to the "UIDynamicAnimator" - think of this as the controller of UIKit Dynamics, which handles/calculates the underlying physics for the items.
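A minimal sketch, assuming a view controller with a hypothetical boxView subview that should fall under gravity and collide with the screen edges (the animator is kept in a strong property, here self.animator, since the behaviors stop once it is deallocated):

    // The animator drives the physics simulation for items inside self.view.
    self.animator = [[UIDynamicAnimator alloc] initWithReferenceView:self.view];

    // Gravity pulls the box downward.
    UIGravityBehavior *gravity = [[UIGravityBehavior alloc] initWithItems:@[self.boxView]];

    // Collisions keep the box inside the reference view's bounds.
    UICollisionBehavior *collision = [[UICollisionBehavior alloc] initWithItems:@[self.boxView]];
    collision.translatesReferenceBoundsIntoBoundary = YES;

    [self.animator addBehavior:gravity];
    [self.animator addBehavior:collision];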