
Get photo EXIF information using a Swift script

What’s EXIF

EXIF, the full name of which is Exchangeable Image File Format, is a standard for recording additional information about photos. It is mainly used in image files taken by digital cameras and mobile phones (such as JPEG and HEIF).

Simply put, EXIF is the “metadata” of a photo, which records various parameters when it was taken.

Common EXIF information

Type | Example
--- | ---
Equipment | Camera brand (Canon, Nikon) and model
Parameters | Shutter speed, aperture, ISO, focal length
Time | Shooting time (accurate to the second)
Location | GPS coordinates (if location tracking is enabled on the device)
Image information | Resolution, orientation (rotation angle)
Software information | Name and version of the software used for post-editing

EXIF usage

  • Photographic analysis: photographers can analyze shooting parameters to refine their technique.
  • Automatic rotation: the system or a website can display the photo correctly based on the orientation information in EXIF.
  • Map markers: if GPS information is present, the photo can be placed on a map.
  • Copyright tracing: some images carry information about the author.
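To illustrate the automatic-rotation use case: EXIF stores orientation as an integer from 1 to 8, and a viewer maps that value to a rotation (and, for four of the values, a mirror flip) before display. A minimal sketch of the mapping, assuming the standard EXIF orientation values; the function name is my own, and the mirrored variants (2, 4, 5, 7) are deliberately left out for brevity:

```swift
import Foundation

/// Degrees a viewer must rotate the image clockwise for the four
/// non-mirrored EXIF orientation values (1, 3, 6, 8).
/// Mirrored variants (2, 4, 5, 7) are ignored in this sketch.
func rotationDegrees(forEXIFOrientation orientation: Int) -> Int? {
    switch orientation {
    case 1: return 0    // normal
    case 3: return 180  // upside down
    case 6: return 90   // rotate 90° clockwise
    case 8: return 270  // rotate 270° clockwise
    default: return nil // mirrored or invalid value
    }
}

print(rotationDegrees(forEXIFOrientation: 6) ?? -1)
```

A phone held in the usual portrait grip typically writes orientation 6, which is why an unrotated thumbnail of such a photo appears lying on its side.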

Why do I need EXIF

As an "amateur photographer", I want my photos to display their shooting parameters, so that anyone curious about how a shot was taken can see them at a glance.

For example:

SONY ILCE-7CM2 · TAMRON E 28-200mm F2.8-5.6 Di III A071 · ISO 250 · 99mm · ƒ6.3 · 1/3200 s

Apple iPhone 15 Pro Max · Telephoto Camera — 120 mm ƒ2.8 · ISO 50 · 120mm · ƒ2.8 · 1/1271 s · HDR

How to read EXIF

As a front-end developer, I vaguely remembered the npm packages for reading EXIF: exifreader and exif-js. Although I am used to writing JavaScript, it is constrained by its runtime environment and has limited capabilities. For example, the file types that exifreader supports are as follows:

File type | Exif | IPTC | XMP | ICC | MPF | Photoshop | MakerNote | Thumbnail | Image details
--- | --- | --- | --- | --- | --- | --- | --- | --- | ---
JPEG | yes | yes | yes | yes | yes | some* | some** | yes | yes
TIFF | yes | yes | yes | yes | – | some* | some** | N/A | N/A
PNG | yes | yes | yes | yes | – | – | some** | no | yes
HEIC/HEIF | yes | no | yes | yes | – | – | some** | yes | no
AVIF | yes | no | yes | yes | – | – | some** | yes | no
WebP | yes | no | yes | yes | – | – | some** | yes | yes
GIF | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | yes

How could a photographer work with only these few file types? Turning to Swift naturally solves the problem: after all, macOS is the operating system with the broadest native support for multimedia file formats.

readExif.swift
#!/usr/bin/env swift

import AppKit
import CoreGraphics
import CoreServices
import Foundation
import ImageIO

func checkImageIsHDR(for fileURL: URL) -> Bool {
    guard let imageSource = CGImageSourceCreateWithURL(fileURL as CFURL, nil) else {
        print("Unable to read image")
        return false
    }

    guard
        let imageProperties = CGImageSourceCopyPropertiesAtIndex(imageSource, 0, nil)
            as? [CFString: Any]
    else {
        print("Unable to get image properties")
        return false
    }

    if let profileName = imageProperties[kCGImagePropertyProfileName] as? String {
        print("ICC Profile Name: \(profileName)")
        if profileName.contains("PQ") || profileName.contains("BT.2100") {
            print("ColorSpace is PQ")
            return true
        }
    }

    // Check if an HDR gain map exists
    if let auxData = CGImageSourceCopyAuxiliaryDataInfoAtIndex(
        imageSource, 0, kCGImageAuxiliaryDataTypeHDRGainMap)
    {
        print("HDR Gain Map data detected: \(auxData)")
        return true
    }
    return false
}

func readExifDescription(from fileURL: URL) -> String? {
    guard let imageSource = CGImageSourceCreateWithURL(fileURL as CFURL, nil) else {
        print("Unable to read file: \(fileURL.path)")
        return nil
    }

    guard
        let properties = CGImageSourceCopyPropertiesAtIndex(imageSource, 0, nil) as? [CFString: Any]
    else {
        print("Unable to extract metadata")
        return nil
    }

    let tiff = properties[kCGImagePropertyTIFFDictionary] as? [CFString: Any]
    let exif = properties[kCGImagePropertyExifDictionary] as? [CFString: Any]

    // Use officially supported fields
    let originalLens = exif?[kCGImagePropertyExifLensModel] as? String

    let make = (tiff?[kCGImagePropertyTIFFMake] as? String ?? "").trimmingCharacters(
        in: .whitespaces)
    let model = (tiff?[kCGImagePropertyTIFFModel] as? String ?? "").trimmingCharacters(
        in: .whitespaces)

    let iso = exif?[kCGImagePropertyExifISOSpeedRatings] as? [Int] ?? []
    let focalLength = exif?[kCGImagePropertyExifFocalLength] as? Double
    let aperture = exif?[kCGImagePropertyExifFNumber] as? Double
    let shutterSpeedValue = exif?[kCGImagePropertyExifExposureTime] as? Double
    let exposureBias = exif?[kCGImagePropertyExifExposureBiasValue] as? Double

    // Map raw lens-model strings to friendly display names
    let lensReplacementMap: [String: String] = [
        "iPhone 15 Pro Max back triple camera 6.765mm f/1.78": "Main Camera — 24 mm ƒ1.78",
        "iPhone 15 Pro Max back triple camera 9.03mm f/2.8": "Telephoto Camera — 77 mm ƒ2.8",
        "iPhone 15 Pro Max back triple camera 2.22mm f/2.2": "Ultra Wide Camera — 13 mm ƒ2.2",
        // Add more replacements here:
        // "Original Lens Model String": "Desired Replacement String",
    ]

    // Apply the replacement if found, otherwise keep the original lens string
    let lens = originalLens.flatMap { lensReplacementMap[$0] } ?? originalLens

    // Check for HDR
    let isHDR: Bool = checkImageIsHDR(for: fileURL)

    // Construct the output format
    var parts: [String] = []
    if !make.isEmpty || !model.isEmpty {
        parts.append("\(make) \(model)")
    }
    if let isoValue = iso.first {
        parts.append("ISO \(isoValue)")
    }
    if let fl = focalLength {
        parts.append("\(Int(round(fl)))mm")
    }
    if let ap = aperture {
        parts.append("ƒ\(String(format: "%.1f", ap))")
    }
    if let ss = shutterSpeedValue {
        if ss >= 1.0 {
            parts.append("\(Int(ss)) s")
        } else {
            let denominator = Int(round(1.0 / ss))
            parts.append("1/\(denominator) s")
        }
    }
    if let ev = exposureBias {
        if abs(ev) < 0.001 {
            parts.append("")
        } else {
            parts.append("\(String(format: "%+.1f", ev))ev")
        }
    }
    if let lens = lens {
        parts.append(lens)
    }
    // Append the HDR tag if the image is HDR
    if isHDR {
        parts.append("HDR")
    }

    return parts.map { "{\($0)}" }.joined()
}

// Main program entry
let args = CommandLine.arguments
guard args.count >= 2 else {
    print("Usage: readExif.swift <Image Path>")
    exit(1)
}

let filePath = args[1]
let fileURL = URL(fileURLWithPath: filePath)

if let description = readExifDescription(from: fileURL) {
    print(description)
    // Copy to the clipboard
    let pasteboard = NSPasteboard.general
    pasteboard.clearContents()
    pasteboard.setString(description, forType: .string)
} else {
    print("Read failed")
}

How to use:

  1. Add executable permissions to the script

     chmod +x readExif.swift

  2. Run it

     ./readExif.swift ~/Downloads/testImage.jpg

Example output: {Apple iPhone 15 Pro Max}{ISO 100}{7mm}{ƒ1.8}{1/22222 s}{}{Main Camera — 24 mm ƒ1.78}

Why this order and format? Because this is the display order I designed for this blog's theme. The lens parameters come last because I added them later.

Lens name conversion

The lens model read from EXIF reports the physical focal length (e.g. 6.765mm) rather than the 35mm-equivalent focal length, which is very unintuitive, so we need a mapping so that the result matches what the Photos app displays.
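The conversion behind those mappings is a crop-factor multiplication: the 35mm-equivalent focal length is the physical focal length times the sensor's crop factor. A small sketch; the function name is my own, and the crop-factor value is an assumption inferred from the main camera's 24 mm equivalent over its 6.765 mm physical focal length:

```swift
import Foundation

/// 35mm-equivalent focal length = physical focal length × sensor crop factor.
func equivalentFocalLength(physicalMM: Double, cropFactor: Double) -> Int {
    return Int((physicalMM * cropFactor).rounded())
}

// Crop factor inferred from the main camera: 24 mm equivalent / 6.765 mm physical
let mainCameraCrop = 24.0 / 6.765

print(equivalentFocalLength(physicalMM: 6.765, cropFactor: mainCameraCrop)) // 24
```

Note that EXIF also defines a FocalLengthIn35mmFilm tag (exposed by ImageIO as kCGImagePropertyExifFocalLenIn35mmFilm); when a camera writes it, that field already holds the equivalent value and no conversion is needed.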

The parameters of the three lenses of iPhone 15 Pro Max correspond to:

{
  "iPhone 15 Pro Max back triple camera 6.765mm f/1.78": "Main Camera — 24 mm ƒ1.78",
  "iPhone 15 Pro Max back triple camera 9.03mm f/2.8": "Telephoto Camera — 77 mm ƒ2.8",
  "iPhone 15 Pro Max back triple camera 2.22mm f/2.2": "Ultra Wide Camera — 13 mm ƒ2.2"
}
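Applying the map is then just a dictionary lookup that falls through to the raw lens string when no entry matches, so unknown lenses still show something sensible. A minimal, self-contained sketch of that lookup; the function name is my own:

```swift
import Foundation

// Raw EXIF lens-model strings mapped to display names (same entries as above)
let lensReplacementMap: [String: String] = [
    "iPhone 15 Pro Max back triple camera 6.765mm f/1.78": "Main Camera — 24 mm ƒ1.78",
    "iPhone 15 Pro Max back triple camera 9.03mm f/2.8": "Telephoto Camera — 77 mm ƒ2.8",
    "iPhone 15 Pro Max back triple camera 2.22mm f/2.2": "Ultra Wide Camera — 13 mm ƒ2.2",
]

/// Return the friendly lens name if one is mapped, otherwise the raw string.
func displayLensName(_ rawLensModel: String?) -> String? {
    return rawLensModel.flatMap { lensReplacementMap[$0] ?? $0 }
}

print(displayLensName("iPhone 15 Pro Max back triple camera 9.03mm f/2.8") ?? "none")
```

The fall-through means a third-party lens like the TAMRON zoom in the earlier example passes through unchanged, while the iPhone's opaque camera descriptions get the readable names Photos uses.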