HarmonyOS Media Development [Camera Data Capture and Saving]: Audio and Video

2024/10/2 6:29:37

Camera Data Capture and Saving

Introduction

This sample demonstrates the camera's core features. Using the libohcamera.so interfaces, it implements preview, photo capture, and video recording, switching between the front and rear cameras for both photo and video, and control features such as focus and exposure.

Effect Preview

(screenshot omitted)

Usage Instructions

  1. A dialog asks whether to allow "CameraSample" to use the camera; tap "Allow".
  2. A dialog asks whether to allow "CameraSample" to use the microphone; tap "Allow".
  3. A dialog asks whether to allow "CameraSample" to access files; tap "Allow".
  4. A dialog asks whether to allow "CameraSample" to access images and videos; tap "Allow".
  5. Enter the preview screen; the preview works. Tap a blurry area of the image and it becomes sharp, clearly showing the focus effect.
  6. Enter the preview screen; the preview works. Swipe up or down on the screen and the scene brightness changes, clearly showing the exposure effect.
  7. Enter the preview screen; the preview works. Switch to photo mode and tap the shutter button; a photo is taken and a thumbnail appears in the lower-left corner. Tapping the thumbnail jumps to Gallery; the image is saved and displays correctly when opened.
  8. Enter the preview screen; the preview works. Switch to the front camera and tap the shutter button; a photo is taken and a thumbnail appears in the lower-left corner. Tapping the thumbnail jumps to Gallery; the image is saved and displays correctly when opened.
  9. Enter the preview screen; the preview works. Switch to video mode, tap record to start, then tap the stop button; the recording succeeds and a video thumbnail appears in the lower-left corner. Tapping the thumbnail jumps to Gallery; the video file is saved and plays back correctly.
  10. Enter the preview screen; the preview works. Switch to the rear camera, tap record to start, then tap the stop button; the recording succeeds and a video thumbnail appears in the lower-left corner. Tapping the thumbnail jumps to Gallery; the video file is saved and plays back correctly.

Implementation Details

  • The camera function interfaces are implemented in CameraManager.cpp

    • The NDKCamera constructor performs camera lifecycle initialization: it calls OH_Camera_GetCameraManager to obtain the CameraManager, OH_CameraManager_CreateCaptureSession to create a CaptureSession, CaptureSessionRegisterCallback to register the CaptureSession callbacks, GetSupportedCameras to get the supported camera devices, GetSupportedOutputCapability to get the supported camera capability set, CreatePreviewOutput to create the preview output, CreateCameraInput to create the camera input, CameraInputOpen to open the camera input, CameraManagerRegisterCallback to register the CameraManager callbacks, and finally SessionFlowFn to start the session.

    • SessionFlowFn is the action that starts the preview. Its main steps: call OH_CaptureSession_BeginConfig to begin configuring the session, OH_CaptureSession_AddInput to add the CameraInput to the session, OH_CaptureSession_AddPreviewOutput to add the previewOutput to the session, OH_CaptureSession_CommitConfig to commit the configuration, and OH_CaptureSession_Start to start the session. One more step happens here: while the preview starts, IsFocusMode is called to enable focusing, which is covered below.
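
The flow above can be sketched as follows. This is illustrative only and not standalone-compilable (libohcamera is available only on-device); the member names captureSession_, cameraInput_, previewOutput_, and focusMode_ are assumptions about the sample's internals:

```cpp
// Sketch of SessionFlowFn: configure and start the preview session.
Camera_ErrorCode NDKCamera::SessionFlowFn(void)
{
    Camera_ErrorCode ret = OH_CaptureSession_BeginConfig(captureSession_);         // enter config state
    if (ret == CAMERA_OK) {
        ret = OH_CaptureSession_AddInput(captureSession_, cameraInput_);           // attach camera input
    }
    if (ret == CAMERA_OK) {
        ret = OH_CaptureSession_AddPreviewOutput(captureSession_, previewOutput_); // attach preview output
    }
    if (ret == CAMERA_OK) {
        ret = OH_CaptureSession_CommitConfig(captureSession_);                     // commit configuration
    }
    if (ret == CAMERA_OK) {
        ret = OH_CaptureSession_Start(captureSession_);                            // start the session
    }
    if (ret == CAMERA_OK) {
        IsFocusMode(focusMode_);  // enable focusing as the preview starts
    }
    return ret;
}
```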

    • The NDKCamera destructor releases the camera lifecycle: it calls OH_CameraManager_DeleteSupportedCameras to delete the supported camera devices, OH_CameraManager_DeleteSupportedCameraOutputCapability to delete the supported camera capability set, and OH_Camera_DeleteCameraManager to delete the camera manager.

    • The photo-capture interfaces are wrapped in StartPhoto. Its main steps: call SessionStop to stop the session, SessionBegin to put the session back into the configuration state, CreatePhotoOutput to create the photo output, OH_CaptureSession_AddPhotoOutput to add the photoOutput to the session, SessionCommitConfig to commit the session, then SessionStart to start the session, and finally TakePicture to trigger the capture.
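
A condensed sketch of that reconfiguration, using the wrapper names from the description (the surfaceId parameter and the photoOutput_ member are assumptions; error handling is elided):

```cpp
// Sketch of StartPhoto: rebuild the running session with a photo output, then capture.
Camera_ErrorCode NDKCamera::StartPhoto(char* surfaceId)
{
    SessionStop();                   // stop the current session
    SessionBegin();                  // re-enter the configuration state
    CreatePhotoOutput(surfaceId);    // create photoOutput_ bound to the image-receiver surface
    OH_CaptureSession_AddPhotoOutput(captureSession_, photoOutput_);
    SessionCommitConfig();           // commit the new configuration
    SessionStart();                  // restart the session
    return TakePicture();            // trigger the capture
}
```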

    • The video-recording interfaces are wrapped in StartVideo. Its main steps: call SessionStop to stop the session, SessionBegin to put the session back into the configuration state, OH_CaptureSession_RemovePhotoOutput to remove the photo output, then CreatePhotoOutput to recreate the photo output and AddPhotoOutput to add it to the session, CreateVideoOutput to create the video output and AddVideoOutput to add it to the session, then SessionCommitConfig and SessionStart to commit and start the session, and finally VideoOutputRegisterCallback to register the VideoOutput callbacks.
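
The video path follows the same pattern, this time attaching both a photo output and a video output. Another non-compilable sketch; the two surface-ID parameters are assumptions:

```cpp
// Sketch of StartVideo: swap outputs so the session carries both photo and video outputs.
Camera_ErrorCode NDKCamera::StartVideo(char* videoId, char* photoId)
{
    SessionStop();                                                      // stop the current session
    SessionBegin();                                                     // re-enter the configuration state
    OH_CaptureSession_RemovePhotoOutput(captureSession_, photoOutput_); // drop the old photo output
    CreatePhotoOutput(photoId);            // recreate the photo output
    AddPhotoOutput();                      // attach it to the session
    CreateVideoOutput(videoId);            // create the recorder-surface output
    AddVideoOutput();                      // attach it to the session
    SessionCommitConfig();                 // commit the new configuration
    SessionStart();                        // restart the session
    return VideoOutputRegisterCallback();  // register the VideoOutput callbacks
}
```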

    • The exposure interfaces are wrapped in IsExposureModeSupportedFn. Its main steps: call OH_CaptureSession_IsExposureModeSupported to check whether the exposure mode is supported, then OH_CaptureSession_SetExposureMode to set it and OH_CaptureSession_GetExposureMode to read back the mode that was set. Exposure compensation is handled by the IsExposureBiasRange interface, which calls OH_CaptureSession_GetExposureBiasRange to get the compensation range, OH_CaptureSession_SetExposureBias to set the exposure bias, and OH_CaptureSession_GetExposureBias to read back the bias that was set.
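
The exposure flow can be sketched like this (not standalone-compilable; the captureSession_ member and the choice of EXPOSURE_MODE_AUTO are assumptions):

```cpp
// Sketch of the exposure flow: check/set/read a mode, then apply exposure compensation.
bool isSupported = false;
Camera_ExposureMode exposureMode = EXPOSURE_MODE_AUTO;
OH_CaptureSession_IsExposureModeSupported(captureSession_, exposureMode, &isSupported);
if (isSupported) {
    OH_CaptureSession_SetExposureMode(captureSession_, exposureMode);
    OH_CaptureSession_GetExposureMode(captureSession_, &exposureMode);  // read back

    // IsExposureBiasRange: query the supported compensation range, then set a bias within it.
    float minBias = 0.0f, maxBias = 0.0f, step = 0.0f;
    OH_CaptureSession_GetExposureBiasRange(captureSession_, &minBias, &maxBias, &step);
    OH_CaptureSession_SetExposureBias(captureSession_, maxBias);        // e.g. brightest allowed value
    float currentBias = 0.0f;
    OH_CaptureSession_GetExposureBias(captureSession_, &currentBias);   // read back
}
```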

    • The focus interfaces are wrapped in IsFocusMode. Its main steps: call OH_CaptureSession_IsFocusModeSupported to check whether the focus mode is supported, OH_CaptureSession_SetFocusMode to set it, and OH_CaptureSession_GetFocusMode to read back the mode that was set.

    • The IsFocusPoint interface handles the focus point: it calls OH_CaptureSession_SetFocusPoint to set the focus point delivered from the JS side, then OH_CaptureSession_GetFocusPoint to read back the point that was set.
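
The two focus bullets together can be sketched as follows (not standalone-compilable; the captureSession_ member, the FOCUS_MODE_CONTINUOUS_AUTO choice, and the example tap coordinates are assumptions):

```cpp
// Sketch of the focus flow: set a focus mode, then apply the tap point from the JS side.
bool isSupported = false;
Camera_FocusMode focusMode = FOCUS_MODE_CONTINUOUS_AUTO;
OH_CaptureSession_IsFocusModeSupported(captureSession_, focusMode, &isSupported);
if (isSupported) {
    OH_CaptureSession_SetFocusMode(captureSession_, focusMode);
    OH_CaptureSession_GetFocusMode(captureSession_, &focusMode);  // read back
}
// IsFocusPoint: the coordinates come from the tap event delivered by ArkTS.
Camera_Point point = { 0.5, 0.5 };                         // hypothetical tap at the preview center
OH_CaptureSession_SetFocusPoint(captureSession_, point);
OH_CaptureSession_GetFocusPoint(captureSession_, &point);  // read back
```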

    • The video-stabilization interfaces are wrapped in IsVideoStabilizationModeSupportedFn. Its main steps: call OH_CaptureSession_IsVideoStabilizationModeSupported to query whether the given stabilization mode is supported, OH_CaptureSession_SetVideoStabilizationMode to set it, and OH_CaptureSession_GetVideoStabilizationMode to read back the mode that was set.
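
The same check-set-read pattern applies to stabilization (sketch only; captureSession_ and the STABILIZATION_MODE_AUTO choice are assumptions):

```cpp
// Sketch of the video-stabilization flow.
bool isSupported = false;
Camera_VideoStabilizationMode mode = STABILIZATION_MODE_AUTO;
OH_CaptureSession_IsVideoStabilizationModeSupported(captureSession_, mode, &isSupported);
if (isSupported) {
    OH_CaptureSession_SetVideoStabilizationMode(captureSession_, mode);
    OH_CaptureSession_GetVideoStabilizationMode(captureSession_, &mode);  // read back
}
```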

    • Callback registration:

      • CameraManagerRegisterCallback: listens for camera status changes; triggered when the camera is opened or exited and when switching between lenses
      • CameraInputRegisterCallback: triggered when a camera-input error occurs
      • PhotoOutputRegisterCallback: triggered when photo capture starts
      • VideoOutputRegisterCallback: triggered when recording mode starts
      • CaptureSessionRegisterCallback: triggered on session errors and when focus mode starts
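
The first of these registrations might look roughly like this sketch (not standalone-compilable; the CameraManager_Callbacks field layout, the status->status member, and the cameraManager_ member are assumptions based on the camera NDK headers):

```cpp
// Sketch of CameraManagerRegisterCallback: listen for camera status changes.
void OnCameraStatus(Camera_Manager* cameraManager, Camera_StatusInfo* status)
{
    OH_LOG_INFO(LOG_APP, "camera status changed: %{public}d", status->status);
}

Camera_ErrorCode NDKCamera::CameraManagerRegisterCallback(void)
{
    static CameraManager_Callbacks callbacks = { .onCameraStatus = OnCameraStatus };
    return OH_CameraManager_RegisterCallback(cameraManager_, &callbacks);
}
```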
  • The calling side for camera preview, photo capture, video recording, and front/rear switching is in tableIndex.ets, modeSwitchPage.ets, and main.cpp. Source reference: [Index.ets]

/*
 * Copyright (c) 2024 Huawei Device Co., Ltd.
 * Licensed under the Apache License, Version 2.0 (the 'License');
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an 'AS IS' BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

import { abilityAccessCtrl } from '@kit.AbilityKit';
import { common } from '@kit.AbilityKit';
import { display } from '@kit.ArkUI';
import { photoAccessHelper } from '@kit.MediaLibraryKit';
import { dataSharePredicates } from '@kit.ArkData';
import { image } from '@kit.ImageKit';
import cameraDemo from 'libentry.so';
import Logger from '../common/utils/Logger';
import { DividerPage } from '../views/DividerPage';
import { ModeSwitchPage } from '../views/ModeSwitchPage';
import { FocusPage } from '../views/FocusPage';
import { FocusAreaPage } from '../views/FocusAreaPage';
import { Constants } from '../common/Constants';
import { SettingDataObj } from '../common/Constants';
import DisplayCalculator from '../common/DisplayCalculator';

const TAG: string = 'UI indexPage';
let context = getContext(this) as common.UIAbilityContext;

@Entry
@Component
struct Index {
  // surfaceID value.
  @State surfaceId: string = '';
  // Select mode.
  @State modelBagCol: string = 'photo';
  // Exposure area.
  @State focusPointBol: boolean = false;
  // Finger click coordinates in the exposure area.
  @State focusPointVal: Array<number> = [0, 0];
  // Display where scale, focal length value, and focus box cannot coexist.
  @State exposureBol: boolean = true;
  // Exposure value.
  @State exposureNum: number = 0;
  // Countdown, photography, and video recording.
  @State countdownNum: number = 0;
  // Front and rear cameras.
  @State cameraDeviceIndex: number = 0;
  @State xComponentWidth: number = 384;
  @State xComponentHeight: number = 450;
  // Reference line.
  @State referenceLineBol: boolean = false;
  @StorageLink('defaultAspectRatio') @Watch('initXComponentSize') defaultAspectRatio: number
    = Constants.MIN_ASPECT_RATIO;
  @State onShow: boolean = false;
  // Thumbnails
  @StorageLink('thumbnail') thumbnail: image.PixelMap | undefined | string = '';
  // XComponentController.
  private mXComponentController: XComponentController = new XComponentController();
  private screenHeight: number = 0;
  private screenWidth: number = 0;
  private settingDataObj: SettingDataObj = {
    mirrorBol: false,
    videoStabilizationMode: 0,
    exposureMode: 1,
    focusMode: 2,
    photoQuality: 1,
    locationBol: false,
    photoFormat: 1,
    photoOrientation: 0,
    photoResolution: 0,
    videoResolution: 0,
    videoFrame: 0,
    referenceLineBol: false
  };
  private appContext: common.Context = getContext(this);
  atManager = abilityAccessCtrl.createAtManager();

  // Entry initialization function.
  async aboutToAppear() {
    await this.requestPermissionsFn();
    let mDisplay = display.getDefaultDisplaySync();
    this.screenWidth = px2vp(mDisplay.width);
    this.screenHeight = px2vp(mDisplay.height);
    this.initXComponentSize();
  }

  initXComponentSize(): void {
    let defaultSize =
      DisplayCalculator.calcSurfaceDisplaySize(this.screenWidth, this.screenHeight, this.defaultAspectRatio);
    this.xComponentWidth = defaultSize.width;
    this.xComponentHeight = defaultSize.height;
  }

  async aboutToDisappear() {
    cameraDemo.releaseCamera();
  }

  // Obtain permissions.
  async requestPermissionsFn() {
    Logger.info(TAG, `requestPermissionsFn entry`);
    try {
      this.atManager.requestPermissionsFromUser(this.appContext, [
        'ohos.permission.CAMERA',
        'ohos.permission.MICROPHONE',
        'ohos.permission.READ_MEDIA',
        'ohos.permission.WRITE_MEDIA',
        'ohos.permission.WRITE_IMAGEVIDEO',
        'ohos.permission.READ_IMAGEVIDEO'
      ]).then(() => {
        Logger.info(TAG, `request Permissions success!`);
        this.onShow = true;
        this.getThumbnail();
      });
    } catch (err) {
      Logger.error(TAG, `requestPermissionsFromUser call Failed! error: ${err.code}`);
    }
  }

  async getThumbnail() {
    let phAccessHelper = photoAccessHelper.getPhotoAccessHelper(context);
    let predicates: dataSharePredicates.DataSharePredicates = new dataSharePredicates.DataSharePredicates();
    let fetchOptions: photoAccessHelper.FetchOptions = {
      fetchColumns: [],
      predicates: predicates
    };
    let fetchResult: photoAccessHelper.FetchResult<photoAccessHelper.PhotoAsset> =
      await phAccessHelper.getAssets(fetchOptions);
    let asset: photoAccessHelper.PhotoAsset = await fetchResult.getLastObject();
    console.info('asset displayName = ', asset.displayName);
    asset.getThumbnail((err, pixelMap) => {
      if (err === undefined) {
        this.thumbnail = pixelMap;
        console.info('getThumbnail successful ' + pixelMap);
      } else {
        console.error(`getThumbnail fail with error: ${err.code}, ${err.message}`);
      }
    });
  }

  async onPageShow() {
    Logger.info(TAG, `onPageShow App`);
    if (this.surfaceId && this.onShow) {
      Logger.error(TAG, `initCamera start`);
      cameraDemo.initCamera(this.surfaceId, this.settingDataObj.focusMode, this.cameraDeviceIndex);
      Logger.error(TAG, `initCamera end`);
    }
    this.getThumbnail();
  }

  onPageHide() {
    Logger.info(TAG, `onPageHide App`);
    this.thumbnail = ''
    cameraDemo.releaseCamera();
  }

  build() {
    Stack() {
      if (this.onShow) {
        // General appearance of a picture.
        XComponent({
          id: 'componentId',
          type: 'surface',
          controller: this.mXComponentController
        })
          .onLoad(async () => {
            Logger.info(TAG, 'onLoad is called');
            this.surfaceId = this.mXComponentController.getXComponentSurfaceId();
            Logger.info(TAG, `onLoad surfaceId: ${this.surfaceId}`);
            Logger.info(TAG, `initCamera start`);
            cameraDemo.initCamera(this.surfaceId, this.settingDataObj.focusMode, this.cameraDeviceIndex);
            Logger.info(TAG, `initCamera end`);
          })
          .backgroundColor(Color.Black)
          .width(Constants.FULL_PERCENT)
          .height(Constants.SEVENTY_PERCENT)
          .margin({
            bottom: Constants.FIFTEEN_PERCENT
          })

        // Reference line.
        DividerPage({ referenceLineBol: this.referenceLineBol });
        // Exposure frame and focus frame.
        FocusPage({
          focusPointBol: $focusPointBol,
          focusPointVal: $focusPointVal,
          exposureBol: $exposureBol,
          exposureNum: $exposureNum
        });
        // Exposure focusing finger click area.
        FocusAreaPage({
          focusPointBol: $focusPointBol,
          focusPointVal: $focusPointVal,
          exposureBol: $exposureBol,
          exposureNum: $exposureNum,
          xComponentWidth: this.xComponentWidth,
          xComponentHeight: this.xComponentHeight
        });
        // Camera switching, mode switching, photo capture and video recording.
        ModeSwitchPage({
          surfaceId: this.surfaceId,
          cameraDeviceIndex: $cameraDeviceIndex,
          countdownNum: $countdownNum
        });
      }
    }
    .height(Constants.FULL_PERCENT)
    .width(Constants.FULL_PERCENT)
    .backgroundColor(Color.Black)
  }
}
  • Source reference: [ModeSwitchPage.ets]
/*
 * Copyright (c) 2023 Huawei Device Co., Ltd.
 * Licensed under the Apache License, Version 2.0 (the 'License');
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an 'AS IS' BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

// Camera switching, mode switching, photo capture and video recording.
import { photoAccessHelper } from '@kit.MediaLibraryKit';
import { dataSharePredicates } from '@kit.ArkData';
import { fileIo } from '@kit.CoreFileKit';
import { BusinessError, deviceInfo } from '@kit.BasicServicesKit';
import { common } from '@kit.AbilityKit';
import { image } from '@kit.ImageKit';
import { media } from '@kit.MediaKit';
import cameraDemo from 'libentry.so';
import Logger from '../common/utils/Logger';
import MediaUtils from '../common/utils/MediaUtils';
import { SettingDataObj } from '../common/Constants'
import { Constants } from '../common/Constants'

let context = getContext(this) as common.UIAbilityContext;

interface PhotoSettings {
  quality: number, // Photo quality
  rotation: number, // Photo direction
  mirror: boolean, // Mirror Enable
  latitude: number, // geographic location
  longitude: number, // geographic location
  altitude: number // geographic location
};

interface PhotoRotationMap {
  rotation0: number,
  rotation90: number,
  rotation180: number,
  rotation270: number,
};

@Component
export struct ModeSwitchPage {
  @State videoId: string = '';
  @State mSurfaceId: string = '';
  // Front and rear cameras
  @Link cameraDeviceIndex: number;
  // SurfaceID
  @Prop surfaceId: string;
  // Countdown value
  @Link countdownNum: number;
  // Countdown timer
  @State countTimerInt: number = -1;
  @State countTimerOut: number = -1;
  // Recording time
  @State videoRecodeTime: number = 0;
  // Recording time timer
  @State timer: number = -1;
  // Select mode
  @State modelBagCol: string = Constants.PHOTO;
  // Choose camera or capture
  @State @Watch('onChangeIsModeBol') isModeBol: boolean = true;
  // Thumbnails
  @StorageLink('thumbnail') thumbnail: image.PixelMap | undefined | string = '';
  private tag: string = 'sample modeSwitchPage:';
  private mediaUtil = MediaUtils.getInstance();
  private photoAsset?: string;
  private fd: number = -1;
  private cameraSize: image.Size = {
    width: 1280,
    height: 720
  };
  private photoSettings: PhotoSettings = {
    quality: 0,
    rotation: 0,
    mirror: false,
    latitude: Constants.LATITUDE,
    longitude: Constants.LONGITUDE,
    altitude: Constants.ALTITUDE
  };
  private mReceiver?: image.ImageReceiver;
  private videoRecorder?: media.AVRecorder;
  private videoConfig: media.AVRecorderConfig = {
    audioSourceType: media.AudioSourceType.AUDIO_SOURCE_TYPE_MIC,
    videoSourceType: media.VideoSourceType.VIDEO_SOURCE_TYPE_SURFACE_YUV,
    profile: {
      audioBitrate: Constants.AUDIO_BITRATE_SAMPLE_RATE,
      audioChannels: Constants.AUDIO_CHANNELS,
      audioCodec: media.CodecMimeType.AUDIO_AAC,
      audioSampleRate: Constants.AUDIO_BITRATE_SAMPLE_RATE,
      fileFormat: media.ContainerFormatType.CFT_MPEG_4,
      videoBitrate: Constants.VIDEO_BITRATE,
      videoCodec: media.CodecMimeType.VIDEO_AVC,
      videoFrameWidth: Constants.VIDEO_FRAME_WIDTH,
      videoFrameHeight: Constants.VIDEO_FRAME_HEIGHT,
      videoFrameRate: Constants.VIDEO_FRAME_RATE
    },
    url: '',
    metadata: {
      videoOrientation: ''
    }
  };
  private photoRotationMap: PhotoRotationMap = {
    rotation0: 0,
    rotation90: 90,
    rotation180: 180,
    rotation270: 270,
  };
  private settingDataObj: SettingDataObj = {
    mirrorBol: false,
    videoStabilizationMode: 0,
    exposureMode: 1,
    focusMode: 2,
    photoQuality: 1,
    locationBol: false,
    photoFormat: 1,
    photoOrientation: 0,
    photoResolution: 0,
    videoResolution: 0,
    videoFrame: 0,
    referenceLineBol: false
  };

  // After pausing, click 'stop' to reset the pause to default.
  onChangeIsModeBol() {
  }

  // Countdown capture and video.
  countTakeVideoFn() {
    if (this.countdownNum) {
      // Clear Countdown.
      if (this.countTimerOut) {
        clearTimeout(this.countTimerOut);
      }
      if (this.countTimerInt) {
        clearInterval(this.countTimerInt);
      }
      // Turn on timer.
      this.countTimerOut = setTimeout(() => {
        // Determine whether it is in video or photo mode.
        this.isVideoPhotoFn();
      }, this.countdownNum * 1000)
      // Turn on timer.
      this.countTimerInt = setInterval(() => {
        this.countdownNum--;
        if (this.countdownNum === 0) {
          clearInterval(this.countTimerInt);
        }
      }, 1000)
    } else {
      this.isVideoPhotoFn();
    }
  }

  async getVideoSurfaceID() {
    Logger.info(this.tag, `getVideoSurfaceID`);
    this.videoRecorder = await media.createAVRecorder();
    Logger.info(this.tag, `getVideoSurfaceID videoRecorder: ${this.videoRecorder}`);

    this.photoAsset = await this.mediaUtil.createAndGetUri(photoAccessHelper.PhotoType.VIDEO);
    Logger.info(this.tag, `getVideoSurfaceID photoAsset: ${this.photoAsset}`);

    this.fd = await this.mediaUtil.getFdPath(this.photoAsset);
    Logger.info(this.tag, `getVideoSurfaceID fd: ${this.fd}`);

    this.videoConfig.url = `fd://${this.fd}`;
    Logger.info(this.tag, `getVideoSurfaceID videoConfig.url : ${this.videoConfig.url}`);

    if (deviceInfo.deviceType === Constants.DEFAULT) {
      Logger.info(this.tag, `deviceType = default`);
      this.videoConfig.videoSourceType = media.VideoSourceType.VIDEO_SOURCE_TYPE_SURFACE_ES;
    }
    if (deviceInfo.deviceType === Constants.PHONE) {
      Logger.info(this.tag, `deviceType = phone`)
      this.videoConfig.videoSourceType = media.VideoSourceType.VIDEO_SOURCE_TYPE_SURFACE_YUV;
      this.videoConfig.profile.videoCodec = media.CodecMimeType.VIDEO_AVC;
      if (this.cameraDeviceIndex === 1) {
        this.videoConfig.metadata = {
          videoOrientation: '270'
        };
      } else {
        this.videoConfig.metadata = {
          videoOrientation: '90'
        };
      }
    }
    if (deviceInfo.deviceType === 'tablet') {
      Logger.info(this.tag, `deviceType = tablet`);
      this.videoConfig.videoSourceType = media.VideoSourceType.VIDEO_SOURCE_TYPE_SURFACE_YUV;
    }

    this.videoConfig.profile.videoFrameWidth = cameraDemo.getVideoFrameWidth();
    this.videoConfig.profile.videoFrameHeight = cameraDemo.getVideoFrameHeight();
    this.videoConfig.profile.videoFrameRate = cameraDemo.getVideoFrameRate();

    await this.videoRecorder.prepare(this.videoConfig);
    this.videoId = await this.videoRecorder.getInputSurface();
    Logger.info(this.tag, `getVideoSurfaceID videoId: ${this.videoId}`);
  }

  createImageReceiver() {
    try {
      this.mReceiver = image.createImageReceiver(this.cameraSize, 2000, 8);
      Logger.info(this.tag, `createImageReceiver value: ${this.mReceiver} `);
      this.mReceiver.on('imageArrival', () => {
        Logger.info(this.tag, 'imageArrival start');
        if (this.mReceiver) {
          this.mReceiver.readNextImage((err, image) => {
            Logger.info(this.tag, 'readNextImage start');
            if (err || image === undefined) {
              Logger.error(this.tag, 'readNextImage failed ');
              return;
            }
            image.getComponent(4, (errMsg, img) => {
              Logger.info(this.tag, 'getComponent start');
              if (errMsg || img === undefined) {
                Logger.info(this.tag, 'getComponent failed ');
                return;
              }
              let buffer = new ArrayBuffer(2048);
              if (img.byteBuffer) {
                buffer = img.byteBuffer;
              } else {
                Logger.error(this.tag, 'img.byteBuffer is undefined');
              }
              this.savePicture(buffer, image);
            })
          })
        }
      })
    } catch {
      Logger.info(this.tag, 'savePicture err');
    }
  }

  // Read Image.
  async savePicture(buffer: ArrayBuffer, img: image.Image) {
    try {
      Logger.info(this.tag, 'savePicture start');
      let photoAssetUri: string = await this.mediaUtil.createAndGetUri(photoAccessHelper.PhotoType.IMAGE);
      let imgPhotoUri: string = photoAssetUri;
      Logger.info(this.tag, `photoUri = ${imgPhotoUri}`);
      let imgFd = await this.mediaUtil.getFdPath(imgPhotoUri);
      Logger.info(this.tag, `fd = ${imgFd}`);
      fileIo.writeSync(imgFd, buffer);
      fileIo.closeSync(imgFd);
      await img.release();
      Logger.info(this.tag, 'save image End');
      setTimeout(() => {
        if (this.handleTakePicture) {
          this.handleTakePicture(imgPhotoUri);
        }
      }, 10)
    } catch (err) {
      Logger.info(this.tag, 'savePicture err' + JSON.stringify(err.message));
    }
  }

  async getPhotoSurfaceID() {
    if (this.mReceiver) {
      Logger.info(this.tag, 'imageReceiver has been created');
    } else {
      this.createImageReceiver();
    }
    if (this.mReceiver) {
      this.mSurfaceId = await this.mReceiver.getReceivingSurfaceId();
    }
    if (this.mSurfaceId) {
      Logger.info(this.tag, `createImageReceiver mSurfaceId: ${this.mSurfaceId} `);
    } else {
      Logger.info(this.tag, `Get mSurfaceId failed `);
    }
  }

  // Determine the video or photo mode.
  async isVideoPhotoFn() {
    await this.getPhotoSurfaceID();

    if (this.modelBagCol === Constants.PHOTO) {
      cameraDemo.startPhotoOrVideo(this.modelBagCol, this.videoId, this.mSurfaceId);
    } else if (this.modelBagCol === Constants.VIDEO) {
      this.isModeBol = false;
      if (this.timer) {
        clearInterval(this.timer);
      }
      // Start record.
      await this.getVideoSurfaceID();
      cameraDemo.startPhotoOrVideo(this.modelBagCol, this.videoId, this.mSurfaceId);
      cameraDemo.videoOutputStart();
      if (this.videoRecorder) {
        this.videoRecorder.start();
      }
    }
  }

  async handleTakePicture(thumbnail: string) {
    this.thumbnail = thumbnail
    Logger.info(this.tag, `takePicture end , thumbnail: ${this.thumbnail}`);
  }

  aboutToDisappear() {
    if (this.mReceiver) {
      this.mReceiver.release().then(() => {
        Logger.info(this.tag, 'release succeeded.');
      }).catch((error: BusinessError) => {
        Logger.error(this.tag, `release failed, error: ${error}`);
      })
    }
  }

  build() {
    if (this.isModeBol) {
      Column() {
        Text($r('app.string.photo'))
          .size({ width: $r('app.float.model_size_width'), height: $r('app.float.model_size_height') })
          .borderRadius($r('app.float.border_radius'))
          .fontSize($r('app.float.photo_video_font_size'))
          .fontColor(Color.White)
          .onClick(() => {
            cameraDemo.releaseSession()
            cameraDemo.initCamera(this.surfaceId, this.settingDataObj.focusMode, this.cameraDeviceIndex)
            this.modelBagCol = Constants.PHOTO
          })
      }.position({ x: Constants.PHOTO_X_POSITION, y: Constants.Y_POSITION })

      Column() {
        Text($r('app.string.video'))
          .size({ width: $r('app.float.model_size_width'), height: $r('app.float.model_size_height') })
          .borderRadius($r('app.float.border_radius'))
          .fontSize($r('app.float.photo_video_font_size'))
          .fontColor(Color.White)
          .onClick(() => {
            cameraDemo.releaseSession()
            cameraDemo.initCamera(this.surfaceId, this.settingDataObj.focusMode, this.cameraDeviceIndex)
            this.modelBagCol = Constants.VIDEO
          })
      }.position({ x: Constants.VIDEO_X_POSITION, y: Constants.Y_POSITION })

      // Album.
      Column() {
        Row() {
          if (this.modelBagCol === Constants.PHOTO) {
            Image(this.thumbnail || $r('app.media.camera_thumbnail_4x'))
              .borderRadius(px2vp(Constants.ICON_SIZE / 2))
              .syncLoad(true)
              .objectFit(ImageFit.Fill)
              .width(px2vp(Constants.ICON_SIZE))
              .height(px2vp(Constants.ICON_SIZE))
          } else {
            Image(this.thumbnail || $r('app.media.camera_thumbnail_4x'))
              .borderRadius(px2vp(Constants.ICON_SIZE / 2))
              .objectFit(ImageFit.Fill)
              .width(px2vp(Constants.ICON_SIZE))
              .height(px2vp(Constants.ICON_SIZE))
          }
        }
        .onClick(() => {
          if (deviceInfo.deviceType === Constants.DEFAULT) {
            context.startAbility({
              bundleName: 'com.ohos.photos',
              abilityName: 'com.ohos.photos.MainAbility'
            })
          } else if (deviceInfo.deviceType === Constants.PHONE) {
            context.startAbility({
              bundleName: 'com.huawei.hmos.photos',
              abilityName: 'com.huawei.hmos.photos.MainAbility'
            })
          }
        })
      }
      .position({ x: Constants.ALBUM_X_POSITION, y: Constants.ICON_Y_POSITION })
      .id('Thumbnail')

      // Capture video icon.
      Column() {
        Row() {
          if (this.modelBagCol === Constants.PHOTO) {
            Image($r('app.media.camera_take_photo_4x'))
              .width(px2vp(Constants.ICON_SIZE))
              .height(px2vp(Constants.ICON_SIZE))
              .onClick(() => {
                // Countdown camera recording - default camera recording.
                this.countTakeVideoFn();
              })
          } else {
            Image($r('app.media.camera_take_video_4x'))
              .width(px2vp(Constants.ICON_SIZE))
              .height(px2vp(Constants.ICON_SIZE))
              .onClick(() => {
                // Countdown camera recording - default camera recording.
                this.countTakeVideoFn();
              })
          }
        }
      }.position({ x: Constants.CAPTURE_X_POSITION, y: Constants.ICON_Y_POSITION })
      .id('CaptureOrVideoButton')

      // Front and rear camera switching.
      Column() {
        Row() {
          Image($r('app.media.camera_switch_4x'))
            .width(px2vp(Constants.ICON_SIZE))
            .height(px2vp(Constants.ICON_SIZE))
            .onClick(async () => {
              // Switching cameras.
              this.cameraDeviceIndex = this.cameraDeviceIndex ? 0 : 1;
              // Clear configuration.
              cameraDemo.releaseSession();
              // Start preview.
              cameraDemo.initCamera(this.surfaceId, this.settingDataObj.focusMode, this.cameraDeviceIndex);
            })
        }
      }.position({ x: Constants.SWITCH_X_POSITION, y: Constants.ICON_Y_POSITION })
      .id('SwitchCameraButton')
    } else {
      Column() {
        // Video capture button.
        Image($r('app.media.camera_take_photo_4x'))
          .width(px2vp(Constants.ICON_SIZE))
          .height(px2vp(Constants.ICON_SIZE))
          .onClick(() => {
            cameraDemo.takePictureWithSettings(this.photoSettings);
          })
      }.position({ x: Constants.ALBUM_X_POSITION, y: Constants.ICON_Y_POSITION })
      .id('VideoCaptureButton')

      Column() {
        Row() {
          Column() {
            // video stop button.
            Image($r('app.media.camera_pause_video_4x'))
              .size({ width: $r('app.float.video_stop_size'), height: $r('app.float.video_stop_size') })
              .width(px2vp(Constants.ICON_SIZE))
              .height(px2vp(Constants.ICON_SIZE))
              .id('StopVideo')
              .onClick(() => {
                if (this.timer) {
                  clearInterval(this.timer);
                }
                // Stop video.
                this.stopVideo().then(() => {
                  this.videoRecodeTime = 0;
                  this.isModeBol = true;
                })
              })
          }
        }
        .width(px2vp(Constants.ICON_SIZE))
        .height(px2vp(Constants.ICON_SIZE))
      }.position({ x: Constants.CAPTURE_X_POSITION, y: Constants.ICON_Y_POSITION })
    }
  }

  async stopVideo() {
    try {
      if (this.videoRecorder) {
        await this.videoRecorder.stop();
        await this.videoRecorder.release();
      }
      cameraDemo.videoOutputStopAndRelease();
      let result: photoAccessHelper.PhotoAsset | undefined = undefined;
      if (this.photoAsset) {
        await fileIo.close(this.fd);
        setTimeout(async () => {
          let phAccessHelper = photoAccessHelper.getPhotoAccessHelper(context);
          let predicates: dataSharePredicates.DataSharePredicates = new dataSharePredicates.DataSharePredicates();
          let fetchOptions: photoAccessHelper.FetchOptions = {
            fetchColumns: [],
            predicates: predicates
          };
          let fetchResult: photoAccessHelper.FetchResult<photoAccessHelper.PhotoAsset> =
            await phAccessHelper.getAssets(fetchOptions);
          let photoAssetList: Array<photoAccessHelper.PhotoAsset> = await fetchResult.getAllObjects();
          photoAssetList.forEach((item: photoAccessHelper.PhotoAsset) => {
            if (item.uri === this.photoAsset) {
              Logger.info(this.tag, `item.uri = ${item.uri}`)
              result = item
            }
          })
          try {
            // Get video thumbnail.
            this.thumbnail = await result?.getThumbnail();
            Logger.info(this.tag, 'videoThumbnail = ' + JSON.stringify(this.thumbnail));
          } catch (err) {
            Logger.error(this.tag, 'videoThumbnail err----------:' + JSON.stringify(err.message));
          }
        }, 1000)
      }
      Logger.info(this.tag, 'stopVideo end');
    } catch (err) {
      Logger.error(this.tag, 'stopVideo err: ' + JSON.stringify(err));
    }
    return;
  }
}
  • Source reference: [main.cpp]
/*
 * Copyright (c) 2024 Huawei Device Co., Ltd.
 * Licensed under the Apache License, Version 2.0 (the 'License');
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an 'AS IS' BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

#include <hilog/log.h>
#include <js_native_api.h>
#include <node_api.h>
#include "camera_manager.h"

using namespace OHOS_CAMERA_SAMPLE;
static NDKCamera *ndkCamera_ = nullptr;
const int32_t ARGS_TWO = 2;
struct Capture_Setting {
    int32_t quality;
    int32_t rotation;
    int32_t location;
    bool mirror;
    int32_t latitude;
    int32_t longitude;
    int32_t altitude;
};

static napi_value SetZoomRatio(napi_env env, napi_callback_info info) {
    size_t argc = 2;
    napi_value args[2] = {nullptr};
    napi_value result;

    napi_get_cb_info(env, info, &argc, args, nullptr, nullptr);

    napi_valuetype valuetype0;
    napi_typeof(env, args[0], &valuetype0);

    int32_t zoomRatio;
    napi_get_value_int32(env, args[0], &zoomRatio);

    OH_LOG_INFO(LOG_APP, "SetZoomRatio : %{public}d", zoomRatio);

    ndkCamera_->setZoomRatioFn(zoomRatio);
    napi_create_int32(env, argc, &result);
    return result;
}

static napi_value HasFlash(napi_env env, napi_callback_info info) {
    OH_LOG_INFO(LOG_APP, "HasFlash");
    size_t argc = 2;
    napi_value args[2] = {nullptr};
    napi_value result;

    napi_get_cb_info(env, info, &argc, args, nullptr, nullptr);

    napi_valuetype valuetype0;
    napi_typeof(env, args[0], &valuetype0);

    int32_t flashMode;
    napi_get_value_int32(env, args[0], &flashMode);

    OH_LOG_INFO(LOG_APP, "HasFlash flashMode : %{public}d", flashMode);

    ndkCamera_->HasFlashFn(flashMode);
    napi_create_int32(env, argc, &result);
    return result;
}

static napi_value IsVideoStabilizationModeSupported(napi_env env, napi_callback_info info) {
    OH_LOG_INFO(LOG_APP, "IsVideoStabilizationModeSupportedFn");
    size_t argc = 2;
    napi_value args[2] = {nullptr};
    napi_value result;

    napi_get_cb_info(env, info, &argc, args, nullptr, nullptr);

    napi_valuetype valuetype0;
    napi_typeof(env, args[0], &valuetype0);

    int32_t videoMode;
    napi_get_value_int32(env, args[0], &videoMode);

    OH_LOG_INFO(LOG_APP, "IsVideoStabilizationModeSupportedFn videoMode : %{public}d", videoMode);

    ndkCamera_->IsVideoStabilizationModeSupportedFn(videoMode);
    napi_create_int32(env, argc, &result);
    return result;
}

static napi_value InitCamera(napi_env env, napi_callback_info info) {
    OH_LOG_INFO(LOG_APP, "InitCamera Start");
    size_t argc = 3;
    napi_value args[3] = {nullptr};
    napi_value result;
    size_t typeLen = 0;
    char *surfaceId = nullptr;

    napi_get_cb_info(env, info, &argc, args, nullptr, nullptr);

    napi_get_value_string_utf8(env, args[0], nullptr, 0, &typeLen);
    surfaceId = new char[typeLen + 1];
    napi_get_value_string_utf8(env, args[0], surfaceId, typeLen + 1, &typeLen);

    napi_valuetype valuetype1;
    napi_typeof(env, args[1], &valuetype1);

    int32_t focusMode;
    napi_get_value_int32(env, args[1], &focusMode);

    uint32_t cameraDeviceIndex;
    napi_get_value_uint32(env, args[ARGS_TWO], &cameraDeviceIndex);

    OH_LOG_INFO(LOG_APP, "InitCamera focusMode : %{public}d", focusMode);
    OH_LOG_INFO(LOG_APP, "InitCamera surfaceId : %{public}s", surfaceId);
    OH_LOG_INFO(LOG_APP, "InitCamera cameraDeviceIndex : %{public}d", cameraDeviceIndex);

    if (ndkCamera_) {
        OH_LOG_INFO(LOG_APP, "ndkCamera_ is not null");
        delete ndkCamera_;
        ndkCamera_ = nullptr;
    }
    ndkCamera_ = new NDKCamera(surfaceId, focusMode, cameraDeviceIndex);
    OH_LOG_INFO(LOG_APP, "InitCamera End");
    napi_create_int32(env, argc, &result);
    return result;
}

static napi_value ReleaseCamera(napi_env env, napi_callback_info info) {
    OH_LOG_INFO(LOG_APP, "ReleaseCamera Start");
    size_t argc = 2;
    napi_value args[2] = {nullptr};
    napi_value result;

    napi_get_cb_info(env, info, &argc, args, nullptr, nullptr);

    ndkCamera_->ReleaseCamera();
    if (ndkCamera_) {
        OH_LOG_INFO(LOG_APP, "ndkCamera_ is not null");
        delete ndkCamera_;
        ndkCamera_ = nullptr;
    }
    OH_LOG_INFO(LOG_APP, "ReleaseCamera End");
    napi_create_int32(env, argc, &result);
    return result;
}
static napi_value ReleaseSession(napi_env env, napi_callback_info info) {
    OH_LOG_INFO(LOG_APP, "ReleaseSession Start");
    size_t argc = 2;
    napi_value args[2] = {nullptr};
    napi_value result;

    napi_get_cb_info(env, info, &argc, args, nullptr, nullptr);

    ndkCamera_->ReleaseSession();

    OH_LOG_INFO(LOG_APP, "ReleaseSession End");
    napi_create_int32(env, argc, &result);
    return result;
}
static napi_value StartPhotoOrVideo(napi_env env, napi_callback_info info) {
    OH_LOG_INFO(LOG_APP, "StartPhotoOrVideo Start");
    Camera_ErrorCode ret = CAMERA_OK;
    size_t argc = 3;
    napi_value args[3] = {nullptr};
    napi_value result;
    size_t typeLen = 0;
    size_t videoIdLen = 0;
    size_t photoIdLen = 0;
    char *modeFlag = nullptr;
    char *videoId = nullptr;
    char *photoId = nullptr;

    napi_get_cb_info(env, info, &argc, args, nullptr, nullptr);

    napi_get_value_string_utf8(env, args[0], nullptr, 0, &typeLen);
    modeFlag = new char[typeLen + 1];
    napi_get_value_string_utf8(env, args[0], modeFlag, typeLen + 1, &typeLen);

    napi_get_value_string_utf8(env, args[1], nullptr, 0, &videoIdLen);
    videoId = new char[videoIdLen + 1];
    napi_get_value_string_utf8(env, args[1], videoId, videoIdLen + 1, &videoIdLen);

    napi_get_value_string_utf8(env, args[ARGS_TWO], nullptr, 0, &photoIdLen);
    photoId = new char[photoIdLen + 1];
    napi_get_value_string_utf8(env, args[ARGS_TWO], photoId, photoIdLen + 1, &photoIdLen);

    if (!strcmp(modeFlag, "photo")) {
        OH_LOG_INFO(LOG_APP, "StartPhoto surfaceId %{public}s", photoId);
        ret = ndkCamera_->StartPhoto(photoId);
    } else if (!strcmp(modeFlag, "video")) {
        ret = ndkCamera_->StartVideo(videoId, photoId);
        OH_LOG_INFO(LOG_APP, "StartPhotoOrVideo %{public}s, %{public}s", videoId, photoId);
    }
    napi_create_int32(env, ret, &result);
    return result;
}

static napi_value VideoOutputStart(napi_env env, napi_callback_info info) {
    if (info == nullptr) {
        OH_LOG_ERROR(LOG_APP, "Info is nullptr.");
    }
    OH_LOG_INFO(LOG_APP, "VideoOutputStart Start");
    napi_value result;
    Camera_ErrorCode ret = ndkCamera_->VideoOutputStart();
    napi_create_int32(env, ret, &result);
    return result;
}

static napi_value IsExposureModeSupported(napi_env env, napi_callback_info info) {
    OH_LOG_INFO(LOG_APP, "IsExposureModeSupported exposureMode start.");
    size_t argc = 2;
    napi_value args[2] = {nullptr};
    napi_value result;

    napi_get_cb_info(env, info, &argc, args, nullptr, nullptr);

    napi_valuetype valuetype0;
    napi_typeof(env, args[0], &valuetype0);

    int32_t exposureMode;
    napi_get_value_int32(env, args[0], &exposureMode);

    OH_LOG_INFO(LOG_APP, "IsExposureModeSupported exposureMode : %{public}d", exposureMode);

    ndkCamera_->IsExposureModeSupportedFn(exposureMode);
    OH_LOG_INFO(LOG_APP, "IsExposureModeSupported exposureMode end.");
    napi_create_int32(env, argc, &result);
    return result;
}

static napi_value IsMeteringPoint(napi_env env, napi_callback_info info) {
    size_t argc = 2;
    napi_value args[2] = {nullptr};
    napi_value result;

    napi_get_cb_info(env, info, &argc, args, nullptr, nullptr);

    napi_valuetype valuetype0;
    napi_typeof(env, args[0], &valuetype0);
    int x;
    napi_get_value_int32(env, args[0], &x);

    napi_typeof(env, args[1], &valuetype0);
    int y;
    napi_get_value_int32(env, args[1], &y);
    ndkCamera_->IsMeteringPoint(x, y);
    napi_create_int32(env, argc, &result);
    return result;
}

static napi_value IsExposureBiasRange(napi_env env, napi_callback_info info) {
    OH_LOG_INFO(LOG_APP, "IsExposureBiasRange start.");
    size_t argc = 2;
    napi_value args[2] = {nullptr};
    napi_value result;

    napi_get_cb_info(env, info, &argc, args, nullptr, nullptr);

    napi_valuetype valuetype0;
    napi_typeof(env, args[0], &valuetype0);

    int exposureBiasValue;
    napi_get_value_int32(env, args[0], &exposureBiasValue);
    ndkCamera_->IsExposureBiasRange(exposureBiasValue);
    OH_LOG_INFO(LOG_APP, "IsExposureBiasRange end.");
    napi_create_int32(env, argc, &result);
    return result;
}

static napi_value IsFocusModeSupported(napi_env env, napi_callback_info info) {
    OH_LOG_INFO(LOG_APP, "IsFocusModeSupported start.");
    size_t argc = 2;
    napi_value args[2] = {nullptr};
    napi_value result;

    napi_get_cb_info(env, info, &argc, args, nullptr, nullptr);

    napi_valuetype valuetype0;
    napi_typeof(env, args[0], &valuetype0);

    int32_t focusMode;
    napi_get_value_int32(env, args[0], &focusMode);

    OH_LOG_INFO(LOG_APP, "IsFocusModeSupported focusMode : %{public}d", focusMode);

    ndkCamera_->IsFocusModeSupported(focusMode);
    OH_LOG_INFO(LOG_APP, "IsFocusModeSupported end.");
    napi_create_int32(env, argc, &result);
    return result;
}

static napi_value IsFocusPoint(napi_env env, napi_callback_info info) {
    size_t argc = 2;
    napi_value args[2] = {nullptr};
    napi_value result;

    napi_get_cb_info(env, info, &argc, args, nullptr, nullptr);

    napi_valuetype valuetype0;
    napi_typeof(env, args[0], &valuetype0);
    double x;
    napi_get_value_double(env, args[0], &x);

    napi_valuetype valuetype1;
    napi_typeof(env, args[1], &valuetype1);
    double y;
    napi_get_value_double(env, args[1], &y);

    float focusPointX = static_cast<float>(x);
    float focusPointY = static_cast<float>(y);
    ndkCamera_->IsFocusPoint(focusPointX, focusPointY);
    napi_create_int32(env, argc, &result);
    return result;
}

static napi_value GetVideoFrameWidth(napi_env env, napi_callback_info info) {
    OH_LOG_INFO(LOG_APP, "GetVideoFrameWidth Start");
    size_t argc = 1;
    napi_value args[1] = {nullptr};
    napi_get_cb_info(env, info, &argc, args, nullptr, nullptr);

    napi_value result = nullptr;
    napi_create_int32(env, ndkCamera_->GetVideoFrameWidth(), &result);

    OH_LOG_INFO(LOG_APP, "GetVideoFrameWidth End");
    return result;
}

static napi_value GetVideoFrameHeight(napi_env env, napi_callback_info info) {
    OH_LOG_INFO(LOG_APP, "GetVideoFrameHeight Start");
    size_t argc = 1;
    napi_value args[1] = {nullptr};
    napi_get_cb_info(env, info, &argc, args, nullptr, nullptr);

    napi_value result = nullptr;
    napi_create_int32(env, ndkCamera_->GetVideoFrameHeight(), &result);

    OH_LOG_INFO(LOG_APP, "GetVideoFrameHeight End");
    return result;
}

static napi_value GetVideoFrameRate(napi_env env, napi_callback_info info) {
    OH_LOG_INFO(LOG_APP, "GetVideoFrameRate Start");
    size_t argc = 1;
    napi_value args[1] = {nullptr};
    napi_get_cb_info(env, info, &argc, args, nullptr, nullptr);

    napi_value result = nullptr;
    napi_create_int32(env, ndkCamera_->GetVideoFrameRate(), &result);

    OH_LOG_INFO(LOG_APP, "GetVideoFrameRate End");
    return result;
}

static napi_value VideoOutputStopAndRelease(napi_env env, napi_callback_info info) {
    OH_LOG_INFO(LOG_APP, "VideoOutputStopAndRelease Start");
    size_t argc = 1;
    napi_value args[1] = {nullptr};
    napi_get_cb_info(env, info, &argc, args, nullptr, nullptr);

    napi_value result = nullptr;
    ndkCamera_->VideoOutputStop();
    ndkCamera_->VideoOutputRelease();

    OH_LOG_INFO(LOG_APP, "VideoOutputStopAndRelease End");
    napi_create_int32(env, argc, &result);
    return result;
}

static napi_value TakePicture(napi_env env, napi_callback_info info) {
    if (info == nullptr) {
        OH_LOG_ERROR(LOG_APP, "Info is nullptr.");
    }
    OH_LOG_INFO(LOG_APP, "TakePicture Start");
    napi_value result;
    Camera_ErrorCode ret = ndkCamera_->TakePicture();
    OH_LOG_INFO(LOG_APP, "TakePicture result is %{public}d", ret);
    napi_create_int32(env, ret, &result);
    return result;
}

static napi_value GetCaptureParam(napi_env env, napi_value captureConfigValue, Capture_Setting *config) {
    napi_value value = nullptr;
    napi_get_named_property(env, captureConfigValue, "quality", &value);
    napi_get_value_int32(env, value, &config->quality);

    napi_get_named_property(env, captureConfigValue, "rotation", &value);
    napi_get_value_int32(env, value, &config->rotation);

    napi_get_named_property(env, captureConfigValue, "mirror", &value);
    napi_get_value_bool(env, value, &config->mirror);

    napi_get_named_property(env, captureConfigValue, "latitude", &value);
    napi_get_value_int32(env, value, &config->latitude);

    napi_get_named_property(env, captureConfigValue, "longitude", &value);
    napi_get_value_int32(env, value, &config->longitude);

    napi_get_named_property(env, captureConfigValue, "altitude", &value);
    napi_get_value_int32(env, value, &config->altitude);

    OH_LOG_INFO(LOG_APP,
                "get quality %{public}d, rotation %{public}d, mirror %{public}d, latitude "
                "%{public}d, longitude %{public}d, altitude %{public}d",
                config->quality, config->rotation, config->mirror, config->latitude, config->longitude,
                config->altitude);
    return nullptr;
}
static void SetConfig(Capture_Setting settings, Camera_PhotoCaptureSetting *photoSetting, Camera_Location *location) {
    if (photoSetting == nullptr || location == nullptr) {
        OH_LOG_INFO(LOG_APP, "photoSetting or location is null");
    }
    photoSetting->quality = static_cast<Camera_QualityLevel>(settings.quality);
    photoSetting->rotation = static_cast<Camera_ImageRotation>(settings.rotation);
    photoSetting->mirror = settings.mirror;
    location->altitude = settings.altitude;
    location->latitude = settings.latitude;
    location->longitude = settings.longitude;
    photoSetting->location = location;
}

static napi_value TakePictureWithSettings(napi_env env, napi_callback_info info) {
    OH_LOG_INFO(LOG_APP, "TakePictureWithSettings Start");
    size_t argc = 1;
    napi_value args[1] = {nullptr};
    Camera_PhotoCaptureSetting photoSetting;
    Capture_Setting setting_inner;
    Camera_Location *location = new Camera_Location;

    napi_get_cb_info(env, info, &argc, args, nullptr, nullptr);
    GetCaptureParam(env, args[0], &setting_inner);
    SetConfig(setting_inner, &photoSetting, location);

    napi_value result;
    Camera_ErrorCode ret = ndkCamera_->TakePictureWithPhotoSettings(photoSetting);
    OH_LOG_INFO(LOG_APP, "TakePictureWithSettings result is %{public}d", ret);
    napi_create_int32(env, ret, &result);
    return result;
}

EXTERN_C_START
static napi_value Init(napi_env env, napi_value exports) {
    napi_property_descriptor desc[] = {
        {"initCamera", nullptr, InitCamera, nullptr, nullptr, nullptr, napi_default, nullptr},
        {"startPhotoOrVideo", nullptr, StartPhotoOrVideo, nullptr, nullptr, nullptr, napi_default, nullptr},
        {"videoOutputStart", nullptr, VideoOutputStart, nullptr, nullptr, nullptr, napi_default, nullptr},
        {"setZoomRatio", nullptr, SetZoomRatio, nullptr, nullptr, nullptr, napi_default, nullptr},
        {"hasFlash", nullptr, HasFlash, nullptr, nullptr, nullptr, napi_default, nullptr},
        {"isVideoStabilizationModeSupported", nullptr, IsVideoStabilizationModeSupported, nullptr, nullptr, nullptr,
         napi_default, nullptr},
        {"isExposureModeSupported", nullptr, IsExposureModeSupported, nullptr, nullptr, nullptr, napi_default, nullptr},
        {"isMeteringPoint", nullptr, IsMeteringPoint, nullptr, nullptr, nullptr, napi_default, nullptr},
        {"isExposureBiasRange", nullptr, IsExposureBiasRange, nullptr, nullptr, nullptr, napi_default, nullptr},
        {"IsFocusModeSupported", nullptr, IsFocusModeSupported, nullptr, nullptr, nullptr, napi_default, nullptr},
        {"isFocusPoint", nullptr, IsFocusPoint, nullptr, nullptr, nullptr, napi_default, nullptr},
        {"getVideoFrameWidth", nullptr, GetVideoFrameWidth, nullptr, nullptr, nullptr, napi_default, nullptr},
        {"getVideoFrameHeight", nullptr, GetVideoFrameHeight, nullptr, nullptr, nullptr, napi_default, nullptr},
        {"getVideoFrameRate", nullptr, GetVideoFrameRate, nullptr, nullptr, nullptr, napi_default, nullptr},
        {"videoOutputStopAndRelease", nullptr, VideoOutputStopAndRelease, nullptr, nullptr, nullptr, napi_default,
         nullptr},
        {"takePicture", nullptr, TakePicture, nullptr, nullptr, nullptr, napi_default, nullptr},
        {"takePictureWithSettings", nullptr, TakePictureWithSettings, nullptr, nullptr, nullptr, napi_default, nullptr},
        {"releaseSession", nullptr, ReleaseSession, nullptr, nullptr, nullptr, napi_default, nullptr},
        {"releaseCamera", nullptr, ReleaseCamera, nullptr, nullptr, nullptr, napi_default, nullptr}};
    napi_define_properties(env, exports, sizeof(desc) / sizeof(desc[0]), desc);
    return exports;
}
EXTERN_C_END

static napi_module demoModule = {
    .nm_version = 1,
    .nm_flags = 0,
    .nm_filename = nullptr,
    .nm_register_func = Init,
    .nm_modname = "entry",
    .nm_priv = ((void *)0),
    .reserved = {0},
};

extern "C" __attribute__((constructor)) void RegisterEntryModule(void) { napi_module_register(&demoModule); }
  • Preview: the preview is started from the onPageShow callback in Index.ets, which calls cameraDemo.initCamera with the preview surfaceId, the focus-mode value, and the camera device index (front or rear) as arguments. This actually invokes InitCamera in main.cpp, which converts the parameters received from the JS side and passes them to the constructor in CameraManager.cpp, opening the camera, starting the preview, and configuring the focus mode.
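
The hand-off above can be sketched in TypeScript. The `cameraDemo` stub here is an assumption standing in for the native module loaded from `libentry.so`; it only records the call so the argument flow is visible:

```typescript
// Hypothetical stand-in for the native module exported by libentry.so; the
// real page loads it with `import cameraDemo from 'libentry.so'`.
interface CameraNative {
  initCamera(surfaceId: string, focusMode: number, deviceIndex: number): number;
}

const calls: [string, number, number][] = [];
const cameraDemo: CameraNative = {
  initCamera(surfaceId: string, focusMode: number, deviceIndex: number): number {
    calls.push([surfaceId, focusMode, deviceIndex]);
    return 0; // CAMERA_OK on the native side
  },
};

// Mirrors what onPageShow in Index.ets does: forward the XComponent's
// surfaceId, the focus mode, and the device index (0 = back camera).
function onPageShow(surfaceId: string, focusMode: number, deviceIndex: number): void {
  cameraDemo.initCamera(surfaceId, focusMode, deviceIndex);
}

onPageShow('12345', 2, 0);
```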

    • Photo and video: capture is started from the isVideoPhotoFn callback in ModeSwitchPage.ets, which checks whether modelBagCol is photo or video and passes modelBagCol, the videoId, and the photo or video surfaceId into startPhotoOrVideo. In photo mode, the photo surfaceId is obtained from getPhotoSurfaceID in ModeSwitchPage.ets and handed to StartPhotoOrVideo in main.cpp, which converts the incoming parameters and calls StartPhoto on the CameraManager object to start the capture; in video mode, the video surfaceId is obtained from getVideoSurfaceID in ModeSwitchPage.ets and handed to StartPhotoOrVideo in main.cpp, which converts the incoming parameters and calls StartVideo on the CameraManager object to start recording.
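
The dispatch above can be modeled as a pure function (a simplified sketch of what the native StartPhotoOrVideo does, not the real implementation):

```typescript
// Minimal model of the mode dispatch in main.cpp's StartPhotoOrVideo:
// "photo" routes to StartPhoto(photoId), "video" to StartVideo(videoId, photoId).
type Mode = 'photo' | 'video';

function startPhotoOrVideo(modeFlag: Mode, videoId: string, photoId: string): string {
  if (modeFlag === 'photo') {
    return `StartPhoto(${photoId})`;             // photo mode only needs the photo surface
  }
  return `StartVideo(${videoId}, ${photoId})`;   // video mode needs both surfaces
}
```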
    • Front/rear switch: the camera switch logic lives in ModeSwitchPage.ets. It toggles cameraDeviceIndex and releases the previous session configuration by calling cameraDemo.releaseSession, which maps to ReleaseSession in main.cpp and ultimately to ReleaseSession in CameraManager.cpp. It then passes the preview surfaceId, the focus-mode value, and cameraDeviceIndex into cameraDemo.initCamera, following the same logic as the preview path.
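
A minimal sketch of that switch flow, again with a hypothetical `cameraDemo` stub standing in for `libentry.so`:

```typescript
// The stub only records calls so the release-then-reinit order is observable.
const log: string[] = [];
const cameraDemo = {
  releaseSession(): void {
    log.push('releaseSession');
  },
  initCamera(surfaceId: string, focusMode: number, deviceIndex: number): void {
    log.push(`initCamera(${surfaceId}, ${focusMode}, ${deviceIndex})`);
  },
};

// Release the old session config, toggle back (0) <-> front (1), then re-run
// the same init path as the preview flow.
function switchCamera(surfaceId: string, focusMode: number, deviceIndex: number): number {
  cameraDemo.releaseSession();
  const next = deviceIndex === 0 ? 1 : 0;
  cameraDemo.initCamera(surfaceId, focusMode, next);
  return next;
}

switchCamera('previewSurface', 2, 0); // switch from back to front
```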
  • The focus and exposure controls are invoked from FocusAreaPage.ets; see the source: [FocusAreaPage.ets]

/*
 * Copyright (c) 2024 Huawei Device Co., Ltd.
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

import cameraDemo from 'libentry.so';
import Logger from '../common/utils/Logger';
import { Constants } from '../common/Constants';

const TAG: string = 'FocusAreaPage';

// Focus Area.
@Component
export struct FocusAreaPage {
  @Link focusPointBol: boolean;
  @Link focusPointVal: Array<number>;
  // The zoom scale, focal length value, and focus frame cannot be shown at the same time.
  @Link exposureBol: boolean;
  // Exposure value.
  @Link exposureNum: number;
  @Prop xComponentWidth: number;
  @Prop xComponentHeight: number;
  // Focusing area display box timer.
  private areaTimer: number = -1;
  // Sliding Exposure Up and Down.
  private panOption: PanGestureOptions = new PanGestureOptions({
    direction: PanDirection.Up | PanDirection.Down,
    fingers: 1
  });

  build() {
    Row() {
    }
    .width(Constants.FULL_PERCENT)
    .height(Constants.SEVENTY_PERCENT)
    .margin({
      bottom: Constants.FIFTEEN_PERCENT
    })
    .opacity(1)
    .id('FocusArea')
    .onTouch((e: TouchEvent) => {
      if (e.type === TouchType.Down) {
        this.focusPointBol = true;
        this.focusPointVal[0] = e.touches[0].windowX;
        this.focusPointVal[1] = e.touches[0].windowY;
        // Focus point.
        cameraDemo.isFocusPoint(
          e.touches[0].windowX / this.xComponentWidth,
          e.touches[0].windowY / this.xComponentHeight
        );
        cameraDemo.isMeteringPoint(
          e.touches[0].windowX / this.xComponentWidth,
          e.touches[0].windowY / this.xComponentHeight + 50
        );
      }
      if (e.type === TouchType.Up) {
        if (this.areaTimer) {
          clearTimeout(this.areaTimer);
        }
        this.areaTimer = setTimeout(() => {
          this.focusPointBol = false;
        }, 3500);
      }
    })
    // Trigger this gesture event by dragging vertically with one finger.
    .gesture(
      PanGesture(this.panOption)
        .onActionStart(() => {
          Logger.info(TAG, 'PanGesture onActionStart');
          this.exposureBol = false;
        })
        .onActionUpdate((event: GestureEvent) => {
          let offset = -event.offsetY;
          if (offset > Constants.EVENT_Y_OFFSET) {
            this.exposureNum = 4;
          }
          if (offset < Constants.EVENT_Y_OFFSET1) {
            this.exposureNum = -4;
          }
          if (offset > Constants.EVENT_Y_OFFSET1 && offset < Constants.EVENT_Y_OFFSET) {
            this.exposureNum = Number((offset / 50).toFixed(1));
          }
          // Exposure Compensation -4 +4.
          cameraDemo.isExposureBiasRange(this.exposureNum);
          Logger.info(TAG, `PanGesture onActionUpdate offset: ${offset}, exposureNum: ${this.exposureNum}`);
        })
        .onActionEnd(() => {
          this.exposureNum = 0;
          this.exposureBol = true;
          Logger.info(TAG, 'PanGesture onActionEnd end');
        })
    )
  }
}
  • Focus: the focus feature lives in FocusAreaPage.ets, which sends the focus point down to the C++ side from its build method. During preview start-up, SessionFlowFn in CameraManager.cpp calls IsFocusMode to check whether the focus mode is supported. The onTouch handler then delivers the tap position through cameraDemo.isFocusPoint to IsFocusPoint in main.cpp, and finally to IsFocusPoint in CameraManager.cpp, which uses the focus point when calling OH_CaptureSession_SetFocusMode to set the focus mode and then OH_CaptureSession_GetFocusMode to read it back, completing the focus feature.
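
The coordinate normalization performed before `isFocusPoint` is called can be shown as a small helper (`toFocusPoint` is an illustrative name, not part of the sample):

```typescript
// The tap position is divided by the XComponent's width and height, so the
// native side receives focus coordinates normalized to [0, 1].
function toFocusPoint(windowX: number, windowY: number,
                      width: number, height: number): [number, number] {
  return [windowX / width, windowY / height];
}

const [fx, fy] = toFocusPoint(360, 640, 720, 1280); // tap at the screen centre
```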
  • Exposure: the exposure feature also lives in FocusAreaPage.ets, which sends the metering point down to the C++ side from its build method. The onTouch handler delivers the metering point through cameraDemo.isMeteringPoint to IsMeteringPoint in main.cpp, and finally to IsMeteringPoint in CameraManager.cpp. Exposure compensation is then applied: a one-finger vertical drag triggers the pan gesture, whose handler calls cameraDemo.isExposureBiasRange to send the exposure value to IsExposureBiasRange in main.cpp; after napi conversion the value reaches IsExposureBiasRange in CameraManager.cpp, which queries the supported exposure compensation range on the native side, calls OH_CaptureSession_SetExposureBias to set the exposure value, and finally OH_CaptureSession_GetExposureBias to read it back, completing the exposure feature.
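
The pan-offset-to-exposure mapping in `onActionUpdate` can be modeled as a pure function. The two thresholds below are assumptions: the sample reads them from `Constants.EVENT_Y_OFFSET` / `EVENT_Y_OFFSET1`, whose values are not shown here; 200 and -200 are chosen to be consistent with the `offset / 50` scaling and the ±4 EV cap:

```typescript
// Assumed threshold values (see lead-in); the sample defines these in Constants.
const EVENT_Y_OFFSET = 200;
const EVENT_Y_OFFSET1 = -200;

function exposureFromPan(offsetY: number): number {
  const offset = -offsetY;                  // dragging up raises exposure
  if (offset > EVENT_Y_OFFSET) return 4;    // clamp at +4 EV
  if (offset < EVENT_Y_OFFSET1) return -4;  // clamp at -4 EV
  return Number((offset / 50).toFixed(1));  // one decimal place in between
}
```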


