Getting started with Camera in React Native

Kyle Buntin
5 min read · Jul 18, 2021

Ever wondered how to capture images and videos in React Native?

At Quick Component, we have mobile templates where this cool camera feature has already been implemented. In this tutorial, we will implement the basic features of the Camera component from expo-camera. We will also take advantage of the Video component from expo-av, which we need in order to play back the recorded video.

First, install our dependencies by running this command:

expo install expo-camera expo-av

Simply paste the code below in your App.js file and see your camera come alive.

import React, { useState, useRef, useEffect } from "react";
import {
  View,
  Text,
  TouchableOpacity,
  SafeAreaView,
  StyleSheet,
  Dimensions,
} from "react-native";
import { Camera } from "expo-camera";
import { Video } from "expo-av";

export default function CameraScreen() {
  const [hasPermission, setHasPermission] = useState(null);
  const [cameraType, setCameraType] = useState(Camera.Constants.Type.back);
  const [isPreview, setIsPreview] = useState(false);
  const [isCameraReady, setIsCameraReady] = useState(false);
  const [isVideoRecording, setIsVideoRecording] = useState(false);
  const [videoSource, setVideoSource] = useState(null);
  const cameraRef = useRef();

  // Ask for camera permission as soon as the component mounts.
  useEffect(() => {
    (async () => {
      const { status } = await Camera.requestPermissionsAsync();
      setHasPermission(status === "granted");
    })();
  }, []);

  const onCameraReady = () => {
    setIsCameraReady(true);
  };

  // Capture a still image and pause the preview so the shot can be reviewed.
  const takePicture = async () => {
    if (cameraRef.current) {
      const options = { quality: 0.5, base64: true, skipProcessing: true };
      const data = await cameraRef.current.takePictureAsync(options);
      const source = data.uri;
      if (source) {
        await cameraRef.current.pausePreview();
        setIsPreview(true);
        console.log("picture", source);
      }
    }
  };

  // Start recording; the promise resolves with the video URI once recording stops.
  const recordVideo = async () => {
    if (cameraRef.current) {
      try {
        const videoRecordPromise = cameraRef.current.recordAsync();
        if (videoRecordPromise) {
          setIsVideoRecording(true);
          const data = await videoRecordPromise;
          const source = data.uri;
          if (source) {
            setIsPreview(true);
            console.log("video source", source);
            setVideoSource(source);
          }
        }
      } catch (error) {
        console.warn(error);
      }
    }
  };

  // Stop an in-progress recording.
  const stopVideoRecording = () => {
    if (cameraRef.current) {
      setIsPreview(false);
      setIsVideoRecording(false);
      cameraRef.current.stopRecording();
    }
  };

  // Toggle between the front and back camera (ignored while previewing).
  const switchCamera = () => {
    if (isPreview) {
      return;
    }
    setCameraType((prevCameraType) =>
      prevCameraType === Camera.Constants.Type.back
        ? Camera.Constants.Type.front
        : Camera.Constants.Type.back
    );
  };

  // Dismiss the captured photo or video and resume the live preview.
  const cancelPreview = async () => {
    await cameraRef.current.resumePreview();
    setIsPreview(false);
    setVideoSource(null);
  };

  const renderCancelPreviewButton = () => (
    <TouchableOpacity onPress={cancelPreview} style={styles.closeButton}>
      <View style={[styles.closeCross, { transform: [{ rotate: "45deg" }] }]} />
      <View
        style={[styles.closeCross, { transform: [{ rotate: "-45deg" }] }]}
      />
    </TouchableOpacity>
  );

  const renderVideoPlayer = () => (
    <Video
      source={{ uri: videoSource }}
      shouldPlay={true}
      style={styles.media}
    />
  );

  const renderVideoRecordIndicator = () => (
    <View style={styles.recordIndicatorContainer}>
      <View style={styles.recordDot} />
      <Text style={styles.recordTitle}>{"Recording..."}</Text>
    </View>
  );

  const renderCaptureControl = () => (
    <View style={styles.control}>
      <TouchableOpacity disabled={!isCameraReady} onPress={switchCamera}>
        <Text style={styles.text}>{"Flip"}</Text>
      </TouchableOpacity>
      <TouchableOpacity
        activeOpacity={0.7}
        disabled={!isCameraReady}
        onLongPress={recordVideo}
        onPressOut={stopVideoRecording}
        onPress={takePicture}
        style={styles.capture}
      />
    </View>
  );

  if (hasPermission === null) {
    return <View />;
  }
  if (hasPermission === false) {
    return <Text style={styles.text}>No access to camera</Text>;
  }

  return (
    <SafeAreaView style={styles.container}>
      <Camera
        ref={cameraRef}
        style={styles.container}
        type={cameraType}
        flashMode={Camera.Constants.FlashMode.on}
        onCameraReady={onCameraReady}
        onMountError={(error) => {
          console.log("camera error", error);
        }}
      />
      <View style={styles.container}>
        {isVideoRecording && renderVideoRecordIndicator()}
        {videoSource && renderVideoPlayer()}
        {isPreview && renderCancelPreviewButton()}
        {!videoSource && !isPreview && renderCaptureControl()}
      </View>
    </SafeAreaView>
  );
}

const WINDOW_HEIGHT = Dimensions.get("window").height;
const closeButtonSize = Math.floor(WINDOW_HEIGHT * 0.032);
const captureSize = Math.floor(WINDOW_HEIGHT * 0.09);

const styles = StyleSheet.create({
  container: {
    ...StyleSheet.absoluteFillObject,
  },
  closeButton: {
    position: "absolute",
    top: 35,
    left: 15,
    height: closeButtonSize,
    width: closeButtonSize,
    borderRadius: Math.floor(closeButtonSize / 2),
    justifyContent: "center",
    alignItems: "center",
    backgroundColor: "#c4c5c4",
    opacity: 0.7,
    zIndex: 2,
  },
  media: {
    ...StyleSheet.absoluteFillObject,
  },
  closeCross: {
    width: "68%",
    height: 1,
    backgroundColor: "black",
  },
  control: {
    position: "absolute",
    flexDirection: "row",
    bottom: 38,
    width: "100%",
    alignItems: "center",
    justifyContent: "center",
  },
  capture: {
    backgroundColor: "#f5f6f5",
    height: captureSize,
    width: captureSize,
    borderRadius: Math.floor(captureSize / 2),
    marginHorizontal: 31,
  },
  recordIndicatorContainer: {
    flexDirection: "row",
    position: "absolute",
    top: 25,
    alignSelf: "center",
    justifyContent: "center",
    alignItems: "center",
    backgroundColor: "transparent",
    opacity: 0.7,
  },
  recordTitle: {
    fontSize: 14,
    color: "#ffffff",
    textAlign: "center",
  },
  recordDot: {
    borderRadius: 3,
    height: 6,
    width: 6,
    backgroundColor: "#ff0000",
    marginHorizontal: 5,
  },
  text: {
    color: "#fff",
  },
});

Let’s examine the code now. To access a native device API like the camera, we definitely need to request permission first, and this is what the useEffect hook does when the component mounts.
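
Here is that part of the code again:

useEffect(() => {
  (async () => {
    const { status } = await Camera.requestPermissionsAsync();
    setHasPermission(status === "granted");
  })();
}, []);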

While the permission request is still awaiting the user's response, hasPermission is null and we render an empty view. If the user denies permission, hasPermission becomes false and we render a simple text saying "No access to camera".
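
Those two guards sit just above the main return statement:

if (hasPermission === null) {
  return <View />;
}
if (hasPermission === false) {
  return <Text style={styles.text}>No access to camera</Text>;
}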

Now that we have permission to access the camera, you should get familiar with the ref prop on the Camera component. There we pass the cameraRef that was previously defined with useRef. In doing so, we get access to some interesting methods that we can call to control the camera.
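
This is the Camera element from the render output:

<Camera
  ref={cameraRef}
  style={styles.container}
  type={cameraType}
  flashMode={Camera.Constants.FlashMode.on}
  onCameraReady={onCameraReady}
  onMountError={(error) => {
    console.log("camera error", error);
  }}
/>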

You can quickly see cameraRef in action in takePicture, where we call the takePictureAsync method as cameraRef.current.takePictureAsync(options), and again in recordVideo, where we call the recordAsync method as cameraRef.current.recordAsync().
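
As a small aside that goes beyond the code above, both methods accept an options object. We already pass quality, base64 and skipProcessing to takePictureAsync; recordAsync takes options too, so a sketch of capping a recording might look like this (option names are taken from the expo-camera documentation, so double-check them against your SDK version):

// Sketch only: limit the recording to 30 seconds and record without audio.
// maxDuration and mute are documented expo-camera options; verify them for your SDK version.
const data = await cameraRef.current.recordAsync({
  maxDuration: 30,
  mute: true,
});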

Let's explain the methods in this file (they are wired to the capture button in the snippet after this list):

  • onCameraReady: This sets the isCameraReady state to true. In doing so, we guard against crashes and use the camera only when it is ready.
  • takePicture: This simply takes a picture using the takePictureAsync method on cameraRef.
  • recordVideo: This is called when we need to record a video.
  • stopVideoRecording: This is called when we need to stop recording a video.
  • switchCamera: This simply enables us to switch between the front and back camera.
  • cancelPreview: This dismisses the captured photo or video and resumes the live camera preview.
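
All of these methods come together in the capture control, where a short press takes a picture, a long press starts a recording, and releasing the button stops it:

<TouchableOpacity
  activeOpacity={0.7}
  disabled={!isCameraReady}
  onLongPress={recordVideo}
  onPressOut={stopVideoRecording}
  onPress={takePicture}
  style={styles.capture}
/>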

Explaining the render helper methods in this file:

The renderCancelPreviewButton, renderVideoPlayer, renderVideoRecordIndicator, and renderCaptureControl functions each simply render a piece of UI when called. When we have no video source and we are not previewing, we render the capture control. When we have a videoSource, we render the video player. When isVideoRecording is true, we render the recording indicator. And when isPreview is true, we render the cancel-preview button.
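
That conditional logic lives in the view layered on top of the camera:

<View style={styles.container}>
  {isVideoRecording && renderVideoRecordIndicator()}
  {videoSource && renderVideoPlayer()}
  {isPreview && renderCancelPreviewButton()}
  {!videoSource && !isPreview && renderCaptureControl()}
</View>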

We are now done with the basic implementation of the camera component. From this tutorial, you can see how easy it really is to add a camera feature to your app. You can go further and improve on this example by playing around with other APIs from the official documentation here.
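
For instance, the code above hard-codes flashMode to Camera.Constants.FlashMode.on. A minimal sketch of a flash toggle, using only APIs that already appear in this tutorial, might look like the following (the flashMode state and toggleFlash helper are illustrative names, not part of the original code):

// Sketch only: keep the flash mode in state and flip it from a button of your own.
const [flashMode, setFlashMode] = useState(Camera.Constants.FlashMode.off);

const toggleFlash = () => {
  setFlashMode((prev) =>
    prev === Camera.Constants.FlashMode.off
      ? Camera.Constants.FlashMode.on
      : Camera.Constants.FlashMode.off
  );
};

// Then pass the state to the Camera component instead of the hard-coded value:
// <Camera flashMode={flashMode} ... />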

Check out some cool mobile templates at Quick Component. These mobile templates are fully functional and come with a backend (Firebase) already configured, saving you quality hours of development time.

This article was first published here by me.
