
I have two strings representing latitude and longitude, like "-56.6462520", and I want to assign them to a CLLocation object so I can compare it to my current location. I tried the following code but I only get errors:

CLLocation * LocationAtual = [[CLLocation alloc]init];
LocationAtual.coordinate.latitude = @"-56.6462520";
LocationAtual.coordinate.longitude = @"-36.6462520";

and then compare the object with my actual location's latitude and longitude. Any suggestions?

Nagarjun
Vinicius Albino

5 Answers


You cannot assign to coordinate directly; it is a readonly property of CLLocation.
Use the following initializer instead:

- (instancetype)initWithLatitude:(CLLocationDegrees)latitude
                       longitude:(CLLocationDegrees)longitude

example:

CLLocation *LocationAtual = [[CLLocation alloc] initWithLatitude:-56.6462520 longitude:-36.6462520];
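To then compare against your current location, CLLocation provides distanceFromLocation:, which returns the distance between two locations in meters. A minimal sketch, assuming currentLocation is the CLLocation you received from CLLocationManager's delegate callback (didUpdateLocations:):

```objc
// Target location built from the numeric coordinates, as above
CLLocation *locationAtual = [[CLLocation alloc] initWithLatitude:-56.6462520
                                                       longitude:-36.6462520];

// currentLocation is assumed to come from your CLLocationManager delegate
CLLocationDistance meters = [currentLocation distanceFromLocation:locationAtual];
if (meters < 100.0) {
    NSLog(@"Within 100 m of the target coordinate");
}
```

Comparing by distance is more robust than comparing latitude/longitude values for equality, since floating-point coordinates from the GPS will almost never match exactly.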
AmitP

I think you need to convert the strings to numbers first. Note that you still cannot assign to coordinate, since it is readonly; convert with doubleValue and pass the values to the initializer:

CLLocationDegrees latitude = [@"-56.6462520" doubleValue];
CLLocationDegrees longitude = [@"-36.6462520" doubleValue];
CLLocation *LocationAtual = [[CLLocation alloc] initWithLatitude:latitude longitude:longitude];
Robot Woods

Swift answer for the lazy:

let locationAtual = CLLocation(latitude: -56.6462520, longitude: -36.6462520)
Paul Lehn

CLLocation's coordinate is actually a read-only property:

    @property (nonatomic, readonly) CLLocationCoordinate2D coordinate;

So the best way to assign dummy data to the coordinates is AmitP's way.

abc123

Latitude and longitude are double values, so they need to be assigned this way (here location is a dictionary holding the coordinate strings):

CLLocation *LocationAtual = [[CLLocation alloc] initWithLatitude:[[location objectForKey:@"latitude"] doubleValue]
                                                       longitude:[[location objectForKey:@"longitude"] doubleValue]];