Posts

Showing posts from 2013

NSUndoManager with Core Data Objects

I am fairly new to using NSUndoManager, especially with Core Data objects. I found it not too difficult, but it works differently from what I expected. Here's what I imagined about NSUndoManager before using it: once an NSUndoManager object is created and assigned to an NSManagedObjectContext (MOC), simply calling the undo method of NSUndoManager would undo all the changes made in the MOC. My idea of NSUndoManager was dead simple, right? Of course, I know there are several features we would miss if it worked that way; for instance, you couldn't undo a specific operation and redo it the same way. After learning step by step, I found that these few simple steps are all it takes to use NSUndoManager: (i) Create an instance of NSUndoManager, like this or with plain alloc/init if you prefer, and set that object on your MOC: …
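The setup described in step (i), plus the undo call, can be sketched as below. This is a minimal sketch, not the post's full code; the commented-out attribute edit is a hypothetical placeholder.

```objectivec
#import <CoreData/CoreData.h>

// Step (i): create an undo manager and attach it to the context.
// (On iOS, an NSManagedObjectContext has no undo manager by default.)
NSUndoManager *undoManager = [[NSUndoManager alloc] init];
managedObjectContext.undoManager = undoManager;

// ... make some changes to managed objects ...
// person.name = @"New Name";   // hypothetical edit

// Undo the changes registered in the current undo group:
[managedObjectContext.undoManager undo];

// To be able to undo a specific batch of edits later,
// wrap them in an explicit undo group:
[undoManager beginUndoGrouping];
// ... edits you may want to revert together ...
[undoManager endUndoGrouping];
```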

Search Mobile Safari for a Specific Word

How many of you know that you can search for a particular word in the webpage loaded in Safari on your iPhone/iPod/iPad? Most might say "No"; I was one of them before, because the solution is somewhat hidden from the user's direct view. It could save you hours when you come across a webpage that is too long and you just want to read about a specific topic, so you can skip everything else and jump straight to the section you want. Here's the trick to find a specific word in a webpage. Let's say I am on "saru2020.blogspot.com" (this one is not too lengthy yet), but we still need to make an effort to find the article we are looking for. Say you want to see only the articles related to "iOS7"; scanning for them by eye is a big task that consumes time. Instead, we can just search with the keyword "iOS7". That is exactly what we're gonna do now. Now, g…

Story of "SARAddressBookBackup" library, that helps to backup contacts in iOS Devices

  SARAddressBookBackup: This is an iOS library to back up the contacts on iOS devices as a .vcf file. The example project on GitHub also illustrates how to email the .vcf file, and the .vcf file can also be accessed/shared through iTunes File Sharing. Story behind its birth: I searched the App Store for apps that could back up all the contacts on my iPhone 5. I usually go with free apps (kind of a freebie person). When trying to back up my contacts, most of the apps asked me to pay to back up more than 500 or 1000 contacts or whatever, depending entirely on the app developer's decision, and I had to abide by their rules. I have more than 1000 contacts in my iPhone Contacts app. Finally, I found an app that backed up all my contacts and even allowed me to send the output file to my email, but it was not reliable (the app crashed several times). [Note: I didn't mention the app name here :) ] I just thought about how hard it could be to develop an app that…
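The core idea behind such a backup, exporting all contacts as one .vcf file, can be sketched with the (pre-iOS 9) AddressBook framework. This is a minimal sketch of the technique, not SARAddressBookBackup's actual implementation; the output filename is a placeholder.

```objectivec
#import <AddressBook/AddressBook.h>

// Sketch: export every contact to a single vCard (.vcf) file.
ABAddressBookRef addressBook = ABAddressBookCreateWithOptions(NULL, NULL);
ABAddressBookRequestAccessWithCompletion(addressBook, ^(bool granted, CFErrorRef error) {
    if (!granted) {
        return; // user denied Contacts access
    }
    CFArrayRef people = ABAddressBookCopyArrayOfAllPeople(addressBook);
    CFDataRef vCardData = ABPersonCreateVCardRepresentationWithPeople(people);

    // Writing to Documents makes the file visible via iTunes File Sharing
    // (when UIFileSharingEnabled is set in Info.plist).
    NSString *docs = [NSSearchPathForDirectoriesInDomains(
        NSDocumentDirectory, NSUserDomainMask, YES) firstObject];
    NSString *path = [docs stringByAppendingPathComponent:@"Contacts.vcf"]; // placeholder name
    [(__bridge NSData *)vCardData writeToFile:path atomically:YES];

    CFRelease(vCardData);
    CFRelease(people);
});
```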

MKLocalSearch in MapKit from iOS6.1

MKLocalSearch provides a simple way to find local points of interest within a geographic region. Because of its no-hassle web-service integration and tight integration with MapKit, any location-based app would do well to take advantage of it. Usage:

MKLocalSearchRequest *request = [[MKLocalSearchRequest alloc] init];
request.naturalLanguageQuery = @"Hospitals";
request.region = mapView.region;

MKLocalSearch *search = [[MKLocalSearch alloc] initWithRequest:request];
[search startWithCompletionHandler:^(MKLocalSearchResponse *response, NSError *error) {
    NSLog(@"Map Items: %@", response.mapItems);
}];

Playing With the Simple "AVSpeechSynthesizer" APIs

iOS7 provides a way to use speech (a kind of text-to-speech conversion). To create an object, you do as below, just like any other NSObject:

AVSpeechSynthesizer *speechSynthesizer = [[AVSpeechSynthesizer alloc] init];

To make it speak, we need to pass an "AVSpeechUtterance" object to it using this API:

[speechSynthesizer speakUtterance:/* AVSpeechUtterance object */];

To create an "AVSpeechUtterance" object, you do as below:

AVSpeechUtterance *utterance = [AVSpeechUtterance speechUtteranceWithString:@"Hey baby!"];

So, the utterance is what provides the text that will be spoken by the synthesizer. It also has a few interesting properties that can be tweaked based on our needs, like rate, volume, pitchMultiplier, preUtteranceDelay, and postUtteranceDelay. For example, to slow down the speech we could use:

utterance.rate *= 0.5;

whereas preUtteranceDelay and postUtteranceDelay are used to delay the speech before and afte…
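Putting the pieces above together, a minimal end-to-end sketch might look like this; the sample text and the property values are arbitrary, and the explicit voice selection is an optional extra not covered in the excerpt.

```objectivec
#import <AVFoundation/AVFoundation.h>

// Minimal text-to-speech sketch assembling the calls described above.
AVSpeechSynthesizer *synthesizer = [[AVSpeechSynthesizer alloc] init];

AVSpeechUtterance *utterance =
    [AVSpeechUtterance speechUtteranceWithString:@"Hello from iOS 7!"];
utterance.rate *= 0.5f;              // slow the speech down
utterance.pitchMultiplier = 1.2f;    // slightly higher pitch
utterance.preUtteranceDelay = 0.3;   // pause before speaking
utterance.voice = [AVSpeechSynthesisVoice voiceWithLanguage:@"en-US"];

[synthesizer speakUtterance:utterance];
```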

Few Other Useful Additions to iOS7

Message UI Framework (Attach Files to Messages): We have always been using "MFMessageComposeViewController" to send messages, but we weren't allowed to add images to them. From iOS7, you can use this API to attach images to messages:

- (BOOL)addAttachmentData:(NSData *)attachmentData typeIdentifier:(NSString *)uti filename:(NSString *)filename;

like below:

if ([MFMessageComposeViewController canSendText] &&
    [MFMessageComposeViewController canSendAttachments] &&
    [MFMessageComposeViewController isSupportedAttachmentUTI:(NSString *)kUTTypePNG]) {
    MFMessageComposeViewController *vc = [[MFMessageComposeViewController alloc] init];
    vc.messageComposeDelegate = self;
    vc.recipients = @[@"sender"];
    UIImage *myImage = [UIImage imageNamed:@"imageName.png"];
    BOOL attached = [vc addAttachmentData:UIImagePNGRepresentation(myImage) typeIdentifier:(NSString *)kUTType…
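The excerpt above is cut off mid-statement, so for reference here is a hedged sketch of how the full attachment flow typically looks. The ending beyond the cut-off point is an assumption based on the API signature quoted above; the image name, filename, and recipient are placeholders.

```objectivec
#import <MessageUI/MessageUI.h>
#import <MobileCoreServices/MobileCoreServices.h>

// Hypothetical completion of the attachment flow (placeholders throughout).
if ([MFMessageComposeViewController canSendText] &&
    [MFMessageComposeViewController canSendAttachments] &&
    [MFMessageComposeViewController isSupportedAttachmentUTI:(NSString *)kUTTypePNG]) {

    MFMessageComposeViewController *vc = [[MFMessageComposeViewController alloc] init];
    vc.messageComposeDelegate = self;   // conform to MFMessageComposeViewControllerDelegate
    vc.recipients = @[@"1234567890"];   // placeholder recipient
    UIImage *myImage = [UIImage imageNamed:@"imageName.png"];
    BOOL attached = [vc addAttachmentData:UIImagePNGRepresentation(myImage)
                           typeIdentifier:(NSString *)kUTTypePNG
                                 filename:@"image.png"];   // placeholder filename
    if (attached) {
        [self presentViewController:vc animated:YES completion:nil];
    }
}
```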