
Question: maxframerate in the dvsdk demo

Hello,

    In the DM365 encode demo, the following structure is defined:

/* Environment passed when creating the thread */
typedef struct VideoEnv {
    Rendezvous_Handle hRendezvousInit;
    Rendezvous_Handle hRendezvousCleanup;
    Rendezvous_Handle hRendezvousWriter;
    Pause_Handle hPauseProcess;
    Fifo_Handle hWriterInFifo;
    Fifo_Handle hWriterOutFifo;
    Fifo_Handle hCaptureInFifo;
    Fifo_Handle hCaptureOutFifo;
    Char *videoEncoder;
    Char *engineName;
    Void *params;
    Void *dynParams;
    Int32 outBufSize;
    Int videoBitRate;
    Int videoFrameRate;
    Int32 imageWidth;
    Int32 imageHeight;
} VideoEnv;

Elsewhere, it is set as follows:

if (args->videoStd == VideoStd_D1_PAL) {
    videoEnv.videoFrameRate = 25000;
} else {
    videoEnv.videoFrameRate = 30000;
}

My understanding: videoFrameRate should be the video frame rate, which is normally 25, 30, or 60.

Question 1: Is this the frame rate?

Question 2: If so, why is it multiplied by 1000?

Question 3: If so, why is only 30 fps supported and not 60 fps? Is it that the driver library doesn't support it?

Ternence_Hsu:

Hello,

feller shi's Question 1: Is this the frame rate?

Yes, this sets the encoding frame rate. The code you are quoting is from the video.c file, right?

feller shi's Question 2: If so, why is it multiplied by 1000?

Here you can refer to the definitions of the codec parameters:

     XDAS_Int32 refFrameRate;    /**< Reference or input frame rate * 1000. */

     XDAS_Int32 targetFrameRate; /**< Target frame rate * 1000. */
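
In other words, the XDM encoder parameters carry frame rates as fps multiplied by 1000, so fractional rates such as 29.97 fps can be represented as the integer 29970. Below is a minimal sketch of the conversion, assuming the IVIDENC1_DynamicParams structure from ividenc1.h (where the two fields quoted above are declared); the helper function name is just for illustration:

/* Sketch: XDM frame rates are expressed as (frames per second) * 1000 */
#include <ti/xdais/dm/ividenc1.h>

static void setEncoderFrameRate(IVIDENC1_DynamicParams *dynParams,
                                XDAS_Int32 frameRateX1000)
{
    dynParams->refFrameRate    = frameRateX1000;   /* e.g. 30 fps    -> 30000 */
    dynParams->targetFrameRate = frameRateX1000;   /* e.g. 29.97 fps -> 29970 */
}

So the demo's 25000 and 30000 correspond to 25 fps (PAL) and 30 fps (NTSC).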

feller shi's Question 3: If so, why is only 30 fps supported and not 60 fps? Is it that the driver library doesn't support it?

That depends on your capture platform: the demo assumes you are capturing D1 video, i.e. PAL or NTSC, and after de-interlacing that gives 25 or 30 frames per second (PAL's 50 fields/s become 25 frames/s; NTSC's ~60 fields/s become 30 frames/s).

feller shi:

Reply to Ternence_Hsu:

1. This is the code from main.c in the DVSDK 4.02 DM365 demo.

2. The earlier points are clear now, but I still have some doubts about the last question:

(1) The branch only checks whether the standard is PAL: if it is, the value is 25000; otherwise it is 30000.

(2) Our board is designed to support 60 fps.

(3) The demo's default configuration is 720p at 60 fps.

(4) So it seems this two-way branch hard-codes the frame rate, doesn't it? (A possible extension is sketched right after the pasted code below.)

There is quite a lot of code, so I have pasted all of it below.

(5) Alternatively, if I want to port a standalone encode and a standalone decode as two independent applications, which code example is the better reference? :)

Thanks

Int main(Int argc, Char *argv[])
{
    Args args = DEFAULT_ARGS;
    Uns initMask = 0;
    Int status = EXIT_SUCCESS;
    Pause_Attrs pAttrs = Pause_Attrs_DEFAULT;
    Rendezvous_Attrs rzvAttrs = Rendezvous_Attrs_DEFAULT;
    Fifo_Attrs fAttrs = Fifo_Attrs_DEFAULT;
    UI_Attrs uiAttrs;
    Rendezvous_Handle hRendezvousCapStd = NULL;
    Rendezvous_Handle hRendezvousInit = NULL;
    Rendezvous_Handle hRendezvousWriter = NULL;
    Rendezvous_Handle hRendezvousCleanup = NULL;
    Pause_Handle hPauseProcess = NULL;
    UI_Handle hUI = NULL;
    struct sched_param schedParam;
    pthread_t captureThread;
    pthread_t writerThread;
    pthread_t videoThread;
    pthread_t audioThread;
    pthread_t speechThread;
    CaptureEnv captureEnv;
    WriterEnv writerEnv;
    VideoEnv videoEnv;
    SpeechEnv speechEnv;
    AudioEnv audioEnv;
    CtrlEnv ctrlEnv;
    Int numThreads;
    pthread_attr_t attr;
    Void *ret;
    Bool stopped;

    /* Zero out the thread environments */
    Dmai_clear(captureEnv);
    Dmai_clear(writerEnv);
    Dmai_clear(videoEnv);
    Dmai_clear(speechEnv);
    Dmai_clear(audioEnv);
    Dmai_clear(ctrlEnv);

    /* Parse the arguments given to the app and set the app environment */
    parseArgs(argc, argv, &args);

    printf("Encode demo started.\n");

    /* Launch interface app */
    if (args.osd) {
        if (launchInterface(&args) == FAILURE) {
            exit(EXIT_FAILURE);
        }
    }

    /* Initialize the mutex which protects the global data */
    pthread_mutex_init(&gbl.mutex, NULL);

    /* Set the priority of this whole process to max (requires root) */
    setpriority(PRIO_PROCESS, 0, -20);

    /* Initialize Codec Engine runtime */
    CERuntime_init();

    /* Initialize Davinci Multimedia Application Interface */
    Dmai_init();

    initMask |= LOGSINITIALIZED;

    /* Create the user interface */
    uiAttrs.osd = args.osd;
    uiAttrs.videoStd = args.videoStd;

    hUI = UI_create(&uiAttrs);

    if (hUI == NULL) {
        cleanup(EXIT_FAILURE);
    }

    /* Get configuration from QT interface if necessary */
    if (args.osd) {
        status = getConfigFromInterface(&args, hUI, &stopped);
        if (status == FAILURE) {
            ERR("Failed to get valid configuration from the GUI\n");
            cleanup(EXIT_FAILURE);
        }
        else if (stopped == TRUE) {
            cleanup(EXIT_SUCCESS);
        }
    }

    /* Validate arguments */
    if (validateArgs(&args) == FAILURE) {
        cleanup(EXIT_FAILURE);
    }

    /* Set up the user interface */
    uiSetup(hUI, &args);

    /* Create the Pause object */
    hPauseProcess = Pause_create(&pAttrs);

    if (hPauseProcess == NULL) {
        ERR("Failed to create Pause object\n");
        cleanup(EXIT_FAILURE);
    }

    /* Determine the number of threads needing synchronization */
    numThreads = 1;

    if (args.videoFile) {
        numThreads += 3;
    }

    if (args.audioFile || args.speechFile) {
        numThreads += 1;
    }

    /* Create the objects which synchronizes the thread init and cleanup */
    hRendezvousCapStd = Rendezvous_create(2, &rzvAttrs);
    hRendezvousInit = Rendezvous_create(numThreads, &rzvAttrs);
    hRendezvousCleanup = Rendezvous_create(numThreads, &rzvAttrs);
    hRendezvousWriter = Rendezvous_create(2, &rzvAttrs);

    if (hRendezvousCapStd == NULL || hRendezvousInit == NULL ||
        hRendezvousCleanup == NULL || hRendezvousWriter == NULL) {
        ERR("Failed to create Rendezvous objects\n");
        cleanup(EXIT_FAILURE);
    }

    /* Initialize the thread attributes */
    if (pthread_attr_init(&attr)) {
        ERR("Failed to initialize thread attrs\n");
        cleanup(EXIT_FAILURE);
    }

    /* Force the thread to use custom scheduling attributes */
    if (pthread_attr_setinheritsched(&attr, PTHREAD_EXPLICIT_SCHED)) {
        ERR("Failed to set schedule inheritance attribute\n");
        cleanup(EXIT_FAILURE);
    }

    /* Set the thread to be fifo real time scheduled */
    if (pthread_attr_setschedpolicy(&attr, SCHED_FIFO)) {
        ERR("Failed to set FIFO scheduling policy\n");
        cleanup(EXIT_FAILURE);
    }

    /* Create the video threads if a file name is supplied */
    if (args.videoFile) {
        /* Create the capture fifos */
        captureEnv.hInFifo = Fifo_create(&fAttrs);
        captureEnv.hOutFifo = Fifo_create(&fAttrs);

        if (captureEnv.hInFifo == NULL || captureEnv.hOutFifo == NULL) {
            ERR("Failed to open display fifos\n");
            cleanup(EXIT_FAILURE);
        }

        /* Create the capture thread */
        captureEnv.hRendezvousInit = hRendezvousInit;
        captureEnv.hRendezvousCapStd = hRendezvousCapStd;
        captureEnv.hRendezvousCleanup = hRendezvousCleanup;
        captureEnv.hPauseProcess = hPauseProcess;
        captureEnv.videoStd = args.videoStd;
        captureEnv.videoInput = args.videoInput;
        captureEnv.imageWidth = args.imageWidth;
        captureEnv.imageHeight = args.imageHeight;
        captureEnv.previewDisabled = args.previewDisabled;

        if (pthread_create(&captureThread, NULL, captureThrFxn, &captureEnv)) {
            ERR("Failed to create capture thread\n");
            cleanup(EXIT_FAILURE);
        }

        initMask |= CAPTURETHREADCREATED;

        /*
         * Once the capture thread has detected the video standard, make it
         * available to other threads. The capture thread will set the
         * resolution of the buffer to encode in the environment (derived
         * from the video standard if the user hasn't passed a resolution).
         */
        Rendezvous_meet(hRendezvousCapStd);

        /* Create the writer fifos */
        writerEnv.hInFifo = Fifo_create(&fAttrs);
        writerEnv.hOutFifo = Fifo_create(&fAttrs);

        if (writerEnv.hInFifo == NULL || writerEnv.hOutFifo == NULL) {
            ERR("Failed to open display fifos\n");
            cleanup(EXIT_FAILURE);
        }

        /* Set the video thread priority */
        schedParam.sched_priority = VIDEO_THREAD_PRIORITY;
        if (pthread_attr_setschedparam(&attr, &schedParam)) {
            ERR("Failed to set scheduler parameters\n");
            cleanup(EXIT_FAILURE);
        }

        /* Create the video thread */
        videoEnv.hRendezvousInit = hRendezvousInit;
        videoEnv.hRendezvousCleanup = hRendezvousCleanup;
        videoEnv.hRendezvousWriter = hRendezvousWriter;
        videoEnv.hPauseProcess = hPauseProcess;
        videoEnv.hCaptureOutFifo = captureEnv.hOutFifo;
        videoEnv.hCaptureInFifo = captureEnv.hInFifo;
        videoEnv.hWriterOutFifo = writerEnv.hOutFifo;
        videoEnv.hWriterInFifo = writerEnv.hInFifo;
        videoEnv.videoEncoder = args.videoEncoder->codecName;
        videoEnv.params = args.videoEncoder->params;
        videoEnv.dynParams = args.videoEncoder->dynParams;
        videoEnv.videoBitRate = args.videoBitRate;
        videoEnv.imageWidth = captureEnv.imageWidth;
        videoEnv.imageHeight = captureEnv.imageHeight;
        videoEnv.engineName = engine->engineName;
        if (args.videoStd == VideoStd_D1_PAL) {
            videoEnv.videoFrameRate = 25000;
        }
        else {
            videoEnv.videoFrameRate = 30000;
        }

        if (pthread_create(&videoThread, &attr, videoThrFxn, &videoEnv)) {
            ERR("Failed to create video thread\n");
            cleanup(EXIT_FAILURE);
        }

        initMask |= VIDEOTHREADCREATED;

        /*
         * Wait for the codec to be created in the video thread before
         * launching the writer thread (otherwise we don't know which size
         * of buffers to use).
         */
        Rendezvous_meet(hRendezvousWriter);

        /* Create the writer thread */
        writerEnv.hRendezvousInit = hRendezvousInit;
        writerEnv.hRendezvousCleanup = hRendezvousCleanup;
        writerEnv.hPauseProcess = hPauseProcess;
        writerEnv.videoFile = args.videoFile;
        writerEnv.outBufSize = videoEnv.outBufSize;
        writerEnv.writeDisabled = args.writeDisabled;

        if (pthread_create(&writerThread, NULL, writerThrFxn, &writerEnv)) {
            ERR("Failed to create writer thread\n");
            cleanup(EXIT_FAILURE);
        }

        initMask |= WRITERTHREADCREATED;

    }

    /* Create the audio thread if a file name is supplied */
    if (args.audioFile) {
        /* Set the thread priority */
        schedParam.sched_priority = AUDIO_THREAD_PRIORITY;
        if (pthread_attr_setschedparam(&attr, &schedParam)) {
            ERR("Failed to set scheduler parameters\n");
            cleanup(EXIT_FAILURE);
        }

        /* Create the audio thread */
        audioEnv.hRendezvousInit = hRendezvousInit;
        audioEnv.hRendezvousCleanup = hRendezvousCleanup;
        audioEnv.hPauseProcess = hPauseProcess;
        audioEnv.engineName = engine->engineName;
        audioEnv.audioEncoder = args.audioEncoder->codecName;
        audioEnv.params = args.audioEncoder->params;
        audioEnv.dynParams = args.audioEncoder->dynParams;
        audioEnv.audioFile = args.audioFile;
        audioEnv.soundInput = args.soundInput;
        audioEnv.soundBitRate = args.soundBitRate;
        audioEnv.sampleRate = args.sampleRate;
        audioEnv.writeDisabled = args.writeDisabled;

        if (pthread_create(&audioThread, &attr, audioThrFxn, &audioEnv)) {
            ERR("Failed to create audio thread\n");
            cleanup(EXIT_FAILURE);
        }

        initMask |= AUDIOTHREADCREATED;
    }

    /* Create the speech thread if a file name is supplied */
    if (args.speechFile) {
        /* Set the thread priority */
        schedParam.sched_priority = SPEECH_THREAD_PRIORITY;
        if (pthread_attr_setschedparam(&attr, &schedParam)) {
            ERR("Failed to set scheduler parameters\n");
            cleanup(EXIT_FAILURE);
        }

        /* Create the speech thread */
        speechEnv.hRendezvousInit = hRendezvousInit;
        speechEnv.hRendezvousCleanup = hRendezvousCleanup;
        speechEnv.hPauseProcess = hPauseProcess;
        speechEnv.speechFile = args.speechFile;
        speechEnv.soundInput = args.soundInput;
        speechEnv.speechEncoder = args.speechEncoder->codecName;
        speechEnv.params = args.speechEncoder->params;
        speechEnv.dynParams = args.speechEncoder->dynParams;
        speechEnv.engineName = engine->engineName;

        if (pthread_create(&speechThread, &attr, speechThrFxn, &speechEnv)) {
            ERR("Failed to create speech thread\n");
            cleanup(EXIT_FAILURE);
        }

        initMask |= SPEECHTHREADCREATED;
    }

    /* Main thread becomes the control thread */
    ctrlEnv.hRendezvousInit = hRendezvousInit;
    ctrlEnv.hRendezvousCleanup = hRendezvousCleanup;
    ctrlEnv.hPauseProcess = hPauseProcess;
    ctrlEnv.keyboard = args.keyboard;
    ctrlEnv.time = args.time;
    ctrlEnv.hUI = hUI;
    ctrlEnv.engineName = engine->engineName;
    ctrlEnv.osd = args.osd;

    ret = ctrlThrFxn(&ctrlEnv);

    if (ret == THREAD_FAILURE) {
        status = EXIT_FAILURE;
    }

cleanup:
    if (args.osd) {
        int rv;

        if (hUI) {
            /* Stop the UI */
            UI_stop(hUI);
        }

        wait(&rv);    /* Wait for child process to end */
    }

    /* Make sure the other threads aren't waiting for init to complete */
    if (hRendezvousCapStd) Rendezvous_force(hRendezvousCapStd);
    if (hRendezvousWriter) Rendezvous_force(hRendezvousWriter);
    if (hRendezvousInit) Rendezvous_force(hRendezvousInit);
    if (hPauseProcess) Pause_off(hPauseProcess);

    /* Wait until the other threads terminate */
    if (initMask & SPEECHTHREADCREATED) {
        if (pthread_join(speechThread, &ret) == 0) {
            if (ret == THREAD_FAILURE) {
                status = EXIT_FAILURE;
            }
        }
    }

    if (initMask & AUDIOTHREADCREATED) {
        if (pthread_join(audioThread, &ret) == 0) {
            if (ret == THREAD_FAILURE) {
                status = EXIT_FAILURE;
            }
        }
    }

    if (initMask & VIDEOTHREADCREATED) {
        if (pthread_join(videoThread, &ret) == 0) {
            if (ret == THREAD_FAILURE) {
                status = EXIT_FAILURE;
            }
        }
    }

    if (initMask & WRITERTHREADCREATED) {
        if (pthread_join(writerThread, &ret) == 0) {
            if (ret == THREAD_FAILURE) {
                status = EXIT_FAILURE;
            }
        }
    }

    if (writerEnv.hOutFifo) {
        Fifo_delete(writerEnv.hOutFifo);
    }

    if (writerEnv.hInFifo) {
        Fifo_delete(writerEnv.hInFifo);
    }

    if (initMask & CAPTURETHREADCREATED) {
        if (pthread_join(captureThread, &ret) == 0) {
            if (ret == THREAD_FAILURE) {
                status = EXIT_FAILURE;
            }
        }
    }

    if (captureEnv.hOutFifo) {
        Fifo_delete(captureEnv.hOutFifo);
    }

    if (captureEnv.hInFifo) {
        Fifo_delete(captureEnv.hInFifo);
    }

    if (hRendezvousCleanup) {
        Rendezvous_delete(hRendezvousCleanup);
    }

    if (hRendezvousInit) {
        Rendezvous_delete(hRendezvousInit);
    }

    if (hPauseProcess) {
        Pause_delete(hPauseProcess);
    }

    if (hUI) {
        UI_delete(hUI);
    }

    /*
     * In the past, there were instances where we have seen system memory
     * continually reduces by 28 bytes at a time whenever there are file
     * reads or file writes. This is for the application to recapture that
     * memory (SDOCM00054899)
     */
    system("sync");
    system("echo 3 > /proc/sys/vm/drop_caches");

    pthread_mutex_destroy(&gbl.mutex);

    exit(status);
}
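
Regarding point (4): here is a minimal sketch of how that two-way branch could be extended for 720p capture. It assumes the DMAI VideoStd enumeration in your DVSDK version defines VideoStd_720P_50 and VideoStd_720P_60 (check DMAI's VideoStd.h), and it is not code from the demo itself:

/* Hypothetical replacement for the PAL/NTSC-only branch in main.c:
 * derive the encoder frame rate (fps * 1000) from the selected standard. */
switch (args.videoStd) {
    case VideoStd_D1_PAL:
        videoEnv.videoFrameRate = 25000;   /* 25 fps */
        break;
    case VideoStd_720P_50:
        videoEnv.videoFrameRate = 50000;   /* 50 fps */
        break;
    case VideoStd_720P_60:
        videoEnv.videoFrameRate = 60000;   /* 60 fps */
        break;
    default:
        videoEnv.videoFrameRate = 30000;   /* D1 NTSC and other 30 fps inputs */
        break;
}

Whether 60 fps actually works end to end still depends on the capture driver, the front-end sensor/decoder, and the codec configuration for your board.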

feller shi:

Reply to Ternence_Hsu:

Hello,

   Do you mean this parameter is only used for D1?

   And that other video standards do not use this parameter?

I haven't looked at the lower-level code yet, and of course I can't get through that much of it in a short time. :)
