Core::Core(int _xRes, int _yRes, bool fullScreen, bool vSync, int aaLevel, int anisotropyLevel, int frameRate, int monitorIndex) : EventDispatcher() {
    int _hz;
    getScreenInfo(&defaultScreenWidth, &defaultScreenHeight, &_hz);
    services = CoreServices::getInstance();
    input = new CoreInput();
    services->setCore(this);
    fps = 0;
    timeLeftOver = 0.0;
    running = true;
    frames = 0;
    lastFrameTicks = 0;
    lastFPSTicks = 0;
    elapsed = 0;
    xRes = _xRes;
    yRes = _yRes;
    paused = false;
    pauseOnLoseFocus = false;
    if (fullScreen && !xRes && !yRes) {
        getScreenInfo(&xRes, &yRes, NULL);
    }
    mouseEnabled = true;
    mouseCaptured = false;
    lastSleepFrameTicks = 0;
    this->monitorIndex = monitorIndex;
    if (frameRate == 0)
        frameRate = 60;
    setFramerate(frameRate);
    threadedEventMutex = NULL;
}
FrameCapture::FrameCapture(CaptureDevicePtr existingCaptureDevice, float framerate)
    : captureDevice(existingCaptureDevice) {
    if (captureDevice == 0) // if the user did not provide a capture device, construct one
        captureDevice = new CaptureDevice(0);
    setFramerate(framerate);
}
void TTMpeg2MainWnd::showFrame(uint f_index) {
    TTPicturesHeader* current_picture;
    TTSequenceHeader* current_sequence;

    try {
        frameInfo->setFrameOffset(header_list->at(index_list->headerListIndex(f_index))->headerOffset());
        current_picture = (TTPicturesHeader*)header_list->at(index_list->headerListIndex(f_index));
        //frameInfo->setNumDisplayOrder( index_list->displayOrder(f_index));
        frameInfo->setNumDisplayOrder(index_list->displayOrder(f_index),
                                      current_picture->temporal_reference);
        frameInfo->setNumStreamOrder(index_list->streamOrder(f_index));
        //frameInfo->setNumStreamOrder( index_list->streamOrder(f_index),
        //                              current_picture->temporal_reference);
        frameInfo->setFrameType(index_list->videoIndexAt(f_index)->picture_coding_type);
        frameInfo->setFrameTime(ttFramesToTime(index_list->displayOrder(f_index), 25.0), total_time);
        frameInfo->setFrameSize((long)index_list->frameSize(f_index));
        //frameInfo->setGOPNumber( index_list->gopNumber( f_index ) );
        frameInfo->setGOPNumber(index_list->gopNumber(f_index),
                                current_picture->temporal_reference);
        //current_sequence = header_list->sequenceHeaderAt( index_list->sequenceIndex( f_index ) );
        current_sequence = video_stream->currentSequenceHeader();
        setBitrate(current_sequence->bit_rate_value);
        setFramerate(current_sequence->frame_rate_code);
        setPictureWidth(current_sequence->horizontal_size_value);
        setPictureHeight(current_sequence->vertical_size_value);
        mpeg2_window->moveToVideoFrame(f_index);
    } catch (TTStreamEOFException) {
        qDebug("stream EOF exception...");
    } catch (TTListIndexException) {
        qDebug("index list exception...");
    } catch (...) {
        qDebug("unknown exception...");
    }
}
void SPHrender::initializeGL() {
    glClearColor(0.0, 0.0, 0.0, 1.0);
    glEnable(GL_DEPTH_TEST);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(-0.5, 0.5, -0.5, 0.5, 0.1, 10.0);
    setFramerate(fps);
    sph.start();
}
void DisplayEngine::initRender() {
    m_NumFrames = 0;
    m_FramesTooLate = 0;
    m_TimeSpentWaiting = 0;
    m_StartTime = TimeSource::get()->getCurrentMicrosecs();
    m_LastFrameTime = m_StartTime;
    m_bInitialized = true;
    if (m_VBRate != 0) {
        setVBlankRate(m_VBRate);
    } else {
        setFramerate(m_Framerate);
    }
}
WindowManager::WindowManager(uint32_t w, uint32_t h, const char *title) {
    setFramerate(30);
    if (-1 == SDL_Init(SDL_INIT_VIDEO)) {
        throw std::runtime_error("Unable to initialize SDL");
    }
    if (!SDL_SetVideoMode(w, h, 32, SDL_OPENGL)) {
        throw std::runtime_error("Unable to open a window");
    }
    SDL_WM_SetCaption(title, 0);
    GLenum error = glewInit();
    if (error != GLEW_OK) {
        throw std::runtime_error("Unable to init GLEW: " +
            std::string((const char*)glewGetErrorString(error)));
    }
}
void FrameGrabberDSA::conditionalBreakpoint() {
    // Grabber thread blocks while paused
    boost::mutex::scoped_lock pause_lock(paused_mutex);

    // Disable push mode
    if (paused && !request_single_frames) {
        ts->SetFramerateRetries(0, true, false, 3, true); // Disable DSA push-mode
    }

    while (paused) { // Loop to catch spurious wake-ups
        paused_changed.wait(pause_lock); // wait() unlocks automatically
    }

    // Prepare to receive tactile sensor frames (push or pull mode)
    setFramerate(framerate); // Restore previous framerate
}
Animation::Animation(Window *window, std::string filename, int width, int height,
                     int start_frame, int frame_count, int framerate)
    : Sprite(window, filename, width, height),
      window(window),
      times_played(0),
      running(true),
      play_count(0),
      frametime(0),
      current_frame(0),
      current_frametime(0) {
    setFramerate(framerate);
    for (int frame = 0; frame < frame_count; ++frame) {
        frames.push_back(frame);
    }
}
Animation::Animation(const utils::Configuration& info, const SDL_Rect& parent)
    : Drawable(info, parent) {
    try {
        setFramerate(std::stoi(info.get(info::Framerate)));

        // Create animation clips
        const int nbr_clips = std::stoi(info.get(info::NbrClips));
        int texture_width = 0;
        int texture_height = 0;
        SDL_QueryTexture(m_texture, NULL, NULL, &texture_width, &texture_height);
        const auto clip_width = texture_width / nbr_clips;
        m_texture_clips.reserve(nbr_clips);
        for (int i = 0; i < nbr_clips; ++i)
            m_texture_clips.push_back(SDL_Rect{clip_width * i, 0, clip_width, texture_height});
    } catch (const std::exception& e) {
        LOG(ERROR) << "Bad animation info file: " << e.what();
    }
}
void FrameGrabberDSA::start(double frameRate, bool startPaused, bool startRecording) {
    framerate = frameRate;
    frame_interval = 1.0 / framerate;
    paused = startPaused;

    // Initialize DSA controller
    if (startPaused) {
        ts->SetFramerateRetries(0, true, false, 3, true); // Disable DSA push-mode
    } else {
        // Prepare to receive tactile sensor frames (push or pull mode)
        setFramerate(framerate);
    }

    if (startRecording) {
        enableRecording();
    }

    current_time.StoreNow();
    last_time = current_time;
    last_time_temperature = current_time;
    grabber_thread = boost::thread(boost::bind(&FrameGrabberDSA::execute, this));
}
void TTMpeg2MainWnd::setInitialValues() {
    // buffer statistics
    setFileLength(0);
    setReadOps(0);
    setFillOps(0);
    laReadTime->setText("--:--:--");

    // stream statistics
    setNumFramesTotal(0);
    setNumIFrames(0);
    setNumPFrames(0);
    setNumBFrames(0);
    setNumSequence(0);
    setNumGOP(0);
    setNumPicture(0);
    setNumSequenceEnd(0);

    // sequence info
    setBitrate(0);
    setFramerate(0.0);
    setPictureWidth(0);
    setPictureHeight(0);
}
int main(int argc, char *argv[])
{
    struct {
        void *start;
        size_t length;
    } *vbuffers, *cbuffers;
    struct v4l2_capability capability;
    struct v4l2_format cformat, vformat;
    struct v4l2_requestbuffers creqbuf, vreqbuf;
    struct v4l2_buffer cfilledbuffer, vfilledbuffer;
    int vfd, cfd;
    int i, ret, count = -1, memtype = V4L2_MEMORY_USERPTR;
    int index = 1, vid = 1, set_video_img = 0;
    int device = 1;
    char *pixelFmt;
    int framerate = 30;
    int req_width = 0, req_height = 0;

    if ((argc > 1) && (!strcmp(argv[1], "?"))) {
        usage();
        return 0;
    }

    if (argc > index) {
        device = atoi(argv[index]);
        index++;
    }

    cfd = open_cam_device(O_RDWR, device);
    if (cfd <= 0) {
        printf("Could not open the cam device\n");
        return -1;
    }

    if (argc > index) {
        framerate = atoi(argv[index]);
        if (framerate == 0) {
            printf("Invalid framerate value, using default "
                   "framerate = 15\n");
            framerate = 15;
        }
        index++;
    }

    if (argc > index) {
        pixelFmt = argv[index];
        index++;
        if (argc <= index) {
            printf("Setting QCIF as video size, default value\n");
            ret = cam_ioctl(cfd, pixelFmt, DEFAULT_VIDEO_SIZE);
            if (ret < 0)
                return -1;
        } else {
            ret = validateSize(argv[index]);
            if (ret == 0) {
                ret = cam_ioctl(cfd, pixelFmt, argv[index]);
                if (ret < 0) {
                    usage();
                    return -1;
                }
            } else {
                index++;
                if (argc <= index) {
                    printf("Invalid size\n");
                    usage();
                    return -1;
                }
                ret = cam_ioctl(cfd, pixelFmt, argv[index-1], argv[index]);
                if (ret < 0) {
                    usage();
                    return -1;
                }
                req_width = atoi(argv[index-1]);
                req_height = atoi(argv[index]);
            }
            index++;
        }
    } else {
        printf("Setting pixel format and video size with default "
               "values\n");
        ret = cam_ioctl(cfd, DEFAULT_PIXEL_FMT, DEFAULT_VIDEO_SIZE);
        if (ret < 0)
            return -1;
    }

    ret = setFramerate(cfd, framerate);
    if (ret < 0) {
        printf("Error setting framerate = %d\n", framerate);
        return -1;
    }

    if (argc > index) {
        vid = atoi(argv[index]);
        if ((vid != 1) && (vid != 2)) {
            printf("vid has to be 1 or 2!\n");
            return 0;
        }
        index++;
    }

    if (argc > index)
        count = atoi(argv[index]);
    printf("Frames: %d\n", count);
    index++;
    if (count >= 1000 || count <= 0)
        count = -1;

    /* Special DSS setup for 720p */
    if (req_width == 1280 && req_height == 720) {
        ret = dss_setup_720p(vid);
        if (ret != 0)
            return ret;
    }

    vfd = open((vid == 1) ? VIDEO_DEVICE1 : VIDEO_DEVICE2, O_RDWR);
    if (vfd <= 0) {
        printf("Could not open the device %s\n",
               (vid == 1) ? VIDEO_DEVICE1 : VIDEO_DEVICE2);
        vid = 0;
    } else
        printf("opened %s for rendering\n",
               (vid == 1) ? VIDEO_DEVICE1 : VIDEO_DEVICE2);

    if (ioctl(vfd, VIDIOC_QUERYCAP, &capability) == -1) {
        perror("dss VIDIOC_QUERYCAP");
        return -1;
    }
    if (capability.capabilities & V4L2_CAP_STREAMING)
        printf("The video driver is capable of Streaming!\n");
    else {
        printf("The video driver is not capable of Streaming!\n");
        return -1;
    }

    if (ioctl(cfd, VIDIOC_QUERYCAP, &capability) < 0) {
        perror("cam VIDIOC_QUERYCAP");
        return -1;
    }
    if (capability.capabilities & V4L2_CAP_STREAMING)
        printf("The camera driver is capable of Streaming!\n");
    else {
        printf("The camera driver is not capable of Streaming!\n");
        return -1;
    }

    cformat.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    ret = ioctl(cfd, VIDIOC_G_FMT, &cformat);
    if (ret < 0) {
        perror("cam VIDIOC_G_FMT");
        return -1;
    }
    printf("Camera Image width = %d, Image height = %d, size = %d\n",
           cformat.fmt.pix.width, cformat.fmt.pix.height,
           cformat.fmt.pix.sizeimage);

    vformat.type = V4L2_BUF_TYPE_VIDEO_OUTPUT;
    ret = ioctl(vfd, VIDIOC_G_FMT, &vformat);
    if (ret < 0) {
        perror("dss VIDIOC_G_FMT");
        return -1;
    }
    printf("Video Image width = %d, Image height = %d, size = %d\n",
           vformat.fmt.pix.width, vformat.fmt.pix.height,
           vformat.fmt.pix.sizeimage);

    if ((cformat.fmt.pix.width != vformat.fmt.pix.width) ||
        (cformat.fmt.pix.height != vformat.fmt.pix.height) ||
        (cformat.fmt.pix.sizeimage != vformat.fmt.pix.sizeimage)) {
        printf("image sizes don't match!\n");
        set_video_img = 1;
    }
    if (cformat.fmt.pix.pixelformat != vformat.fmt.pix.pixelformat) {
        printf("pixel formats don't match!\n");
        set_video_img = 1;
    }

    if (set_video_img) {
        printf("set video image the same as camera image ...\n");
        if (cformat.fmt.pix.width != (cformat.fmt.pix.bytesperline / 2))
            vformat.fmt.pix.width = cformat.fmt.pix.bytesperline / 2;
        else
            vformat.fmt.pix.width = cformat.fmt.pix.width;
        vformat.fmt.pix.height = cformat.fmt.pix.height;
        vformat.fmt.pix.sizeimage = cformat.fmt.pix.sizeimage;
        vformat.fmt.pix.pixelformat = cformat.fmt.pix.pixelformat;
        ret = ioctl(vfd, VIDIOC_S_FMT, &vformat);
        if (ret < 0) {
            perror("dss VIDIOC_S_FMT");
            return -1;
        }
        printf("New Image & Video sizes, after equaling:\n"
               "Camera Image width = %d, Image height = %d, size = %d\n",
               cformat.fmt.pix.width, cformat.fmt.pix.height,
               cformat.fmt.pix.sizeimage);
        printf("Video Image width = %d, Image height = %d, size = %d\n",
               vformat.fmt.pix.width, vformat.fmt.pix.height,
               vformat.fmt.pix.sizeimage);
        if (cformat.fmt.pix.pixelformat != vformat.fmt.pix.pixelformat) {
            printf("can't make camera and video image compatible!\n");
            return 0;
        }
    }

    vreqbuf.type = V4L2_BUF_TYPE_VIDEO_OUTPUT;
    vreqbuf.memory = V4L2_MEMORY_MMAP;
    vreqbuf.count = 4;
    if (ioctl(vfd, VIDIOC_REQBUFS, &vreqbuf) == -1) {
        perror("dss VIDIOC_REQBUFS");
        return -1;
    }
    printf("Video Driver allocated %d buffers when 4 are requested\n",
           vreqbuf.count);

    vbuffers = calloc(vreqbuf.count, sizeof(*vbuffers));
    for (i = 0; i < vreqbuf.count; ++i) {
        struct v4l2_buffer buffer;
        buffer.type = vreqbuf.type;
        buffer.index = i;
        if (ioctl(vfd, VIDIOC_QUERYBUF, &buffer) == -1) {
            perror("dss VIDIOC_QUERYBUF");
            return -1;
        }
        vbuffers[i].length = buffer.length;
        vbuffers[i].start = mmap(NULL, buffer.length,
                                 PROT_READ | PROT_WRITE, MAP_SHARED,
                                 vfd, buffer.m.offset);
        if (vbuffers[i].start == MAP_FAILED) {
            perror("dss mmap");
            return -1;
        }
        printf("Video Buffers[%d].start = %p length = %zu\n",
               i, vbuffers[i].start, vbuffers[i].length);
    }

    creqbuf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    creqbuf.memory = memtype;
    creqbuf.count = 4;
    printf("Requesting %d buffers of type %s\n", creqbuf.count,
           (memtype == V4L2_MEMORY_USERPTR) ? "V4L2_MEMORY_USERPTR"
                                            : "V4L2_MEMORY_MMAP");
    if (ioctl(cfd, VIDIOC_REQBUFS, &creqbuf) < 0) {
        perror("cam VIDIOC_REQBUFS");
        return -1;
    }
    printf("Camera Driver allowed buffers reqbuf.count = %d\n",
           creqbuf.count);

    cbuffers = calloc(creqbuf.count, sizeof(*cbuffers));
    /* mmap driver memory or allocate user memory, and queue each buffer */
    for (i = 0; i < creqbuf.count; ++i) {
        struct v4l2_buffer buffer;
        buffer.type = creqbuf.type;
        buffer.memory = creqbuf.memory;
        buffer.index = i;
        if (ioctl(cfd, VIDIOC_QUERYBUF, &buffer) < 0) {
            perror("cam VIDIOC_QUERYBUF");
            return -1;
        }
        if (memtype == V4L2_MEMORY_USERPTR) {
            buffer.flags = 0;
            buffer.m.userptr = (unsigned long)vbuffers[i].start;
            buffer.length = vbuffers[i].length;
        } else {
            cbuffers[i].length = buffer.length;
            cbuffers[i].start = vbuffers[i].start;
            printf("Mapped Buffers[%d].start = %p length = %zu\n",
                   i, cbuffers[i].start, cbuffers[i].length);
            buffer.m.userptr = (unsigned long)cbuffers[i].start;
            buffer.length = cbuffers[i].length;
        }
        if (ioctl(cfd, VIDIOC_QBUF, &buffer) < 0) {
            perror("cam VIDIOC_QBUF");
            return -1;
        }
    }

    show_sensor_info(cfd);

    /* turn on streaming */
    if (ioctl(cfd, VIDIOC_STREAMON, &creqbuf.type) < 0) {
        perror("cam VIDIOC_STREAMON");
        return -1;
    }

    /* capture up to 1000 frames, or stop when we hit the requested
     * number of frames */
    cfilledbuffer.type = creqbuf.type;
    vfilledbuffer.type = vreqbuf.type;
    i = 0;
    while (i < 1000) {
        /* De-queue the next available buffer */
        while (ioctl(cfd, VIDIOC_DQBUF, &cfilledbuffer) < 0)
            perror("cam VIDIOC_DQBUF");

        vfilledbuffer.index = cfilledbuffer.index;
        vfilledbuffer.type = V4L2_BUF_TYPE_VIDEO_OUTPUT;
        vfilledbuffer.memory = V4L2_MEMORY_MMAP;
        vfilledbuffer.m.userptr =
            (unsigned long)(vbuffers[cfilledbuffer.index].start);
        vfilledbuffer.length = cfilledbuffer.length;
        if (ioctl(vfd, VIDIOC_QBUF, &vfilledbuffer) < 0) {
            perror("dss VIDIOC_QBUF");
            return -1;
        }
        i++;

        if (i == 3) {
            /* Turn on streaming for video */
            if (ioctl(vfd, VIDIOC_STREAMON, &vreqbuf.type)) {
                perror("dss VIDIOC_STREAMON");
                return -1;
            }
        }

        if (i >= 3) {
            /* De-queue the previous buffer from the video driver */
            if (ioctl(vfd, VIDIOC_DQBUF, &vfilledbuffer)) {
                perror("dss VIDIOC_DQBUF");
                return -1;
            }
        }

        if (i == count) {
            printf("Cancelling the streaming capture...\n");
            creqbuf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
            if (ioctl(cfd, VIDIOC_STREAMOFF, &creqbuf.type) < 0) {
                perror("cam VIDIOC_STREAMOFF");
                return -1;
            }
            if (ioctl(vfd, VIDIOC_STREAMOFF, &vreqbuf.type) == -1) {
                perror("dss VIDIOC_STREAMOFF");
                return -1;
            }
            printf("Done\n");
            break;
        }

        if (i >= 3) {
            cfilledbuffer.index = vfilledbuffer.index;
            while (ioctl(cfd, VIDIOC_QBUF, &cfilledbuffer) < 0)
                perror("cam VIDIOC_QBUF");
        }
    }
    printf("Captured %d frames!\n", i);

    /* we didn't turn off streaming yet */
    if (count == -1) {
        creqbuf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        if (ioctl(cfd, VIDIOC_STREAMOFF, &creqbuf.type) == -1) {
            perror("cam VIDIOC_STREAMOFF");
            return -1;
        }
        if (ioctl(vfd, VIDIOC_STREAMOFF, &vreqbuf.type) == -1) {
            perror("dss VIDIOC_STREAMOFF");
            return -1;
        }
    }

    for (i = 0; i < vreqbuf.count; i++) {
        if (vbuffers[i].start)
            munmap(vbuffers[i].start, vbuffers[i].length);
    }
    free(vbuffers);

    for (i = 0; i < creqbuf.count; i++) {
        if (cbuffers[i].start) {
            if (memtype == V4L2_MEMORY_USERPTR)
                free(cbuffers[i].start);
            else
                munmap(cbuffers[i].start, cbuffers[i].length);
        }
    }
    free(cbuffers);

    close(vfd);
    close(cfd);

    /* Reset DSS setup if it was modified for 720p */
    if (req_width == 1280 && req_height == 720)
        dss_reset(vid);

    return 0;
}
int main(int argc, char *argv[])
{
    struct {
        void *start;
        size_t length;
    } *vbuffers;
    /* Structure stores values for key strokes */
    struct input_event {
        struct timeval time;
        unsigned short type;
        unsigned short code;
        unsigned int value;
    } keyinfo;
    struct v4l2_capability capability;
    struct v4l2_format cformat, vformat;
    struct v4l2_requestbuffers creqbuf, vreqbuf;
    struct v4l2_buffer cfilledbuffer, vfilledbuffer;
    int vid = 1, set_video_img = 0, i, ret;
    unsigned int buff_size = 0;
    struct af_configuration af_config_user;
    struct isp_af_data af_data_user;
    __u16 *buff_preview = NULL;
    __u8 *stats_buff = NULL;
    unsigned int buff_prev_size = 0;
    int k;
    int input;
    __u16 wposn = 1;
    int frame_number;
    int j = 0, index = 1;
    FILE *fp_out;
    int framerate = 30;
    int mode = 1;
    int device = 1;
    struct v4l2_queryctrl queryctrl;
    struct v4l2_control control;

    af_config_user.alaw_enable = H3A_AF_ALAW_ENABLE; /* Enable A-law */
    af_config_user.hmf_config.enable = H3A_AF_HMF_DISABLE;
    af_config_user.iir_config.hz_start_pos = 0;
    af_config_user.paxel_config.height = 16;
    af_config_user.paxel_config.width = 16;
    af_config_user.paxel_config.line_incr = 0;
    af_config_user.paxel_config.vt_start = 0;
    af_config_user.paxel_config.hz_start = 2;
    af_config_user.paxel_config.hz_cnt = 8;
    af_config_user.paxel_config.vt_cnt = 8;
    af_config_user.af_config = H3A_AF_CFG_ENABLE;
    af_config_user.hmf_config.threshold = 0;
    /* Set Accumulator mode */
    af_config_user.mode = ACCUMULATOR_SUMMED;
    for (index = 0; index < 11; index++) {
        af_config_user.iir_config.coeff_set0[index] = 12;
        af_config_user.iir_config.coeff_set1[index] = 12;
    }

    fp_out = fopen("af_mid.st", "wb");
    if (fp_out == NULL) {
        printf("ERROR opening output file!\n");
        return -EACCES;
    }

    /* Open keypad input device */
    int kfd = open("/dev/input/event0", O_RDONLY | O_NONBLOCK);

    index = 1;
    if (argc < 2) {
        printf("ERROR: Missing parameters!\n");
        usage();
        return 0;
    }
    if ((argc > 1) && (!strcmp(argv[1], "?"))) {
        usage();
        return 0;
    }
    if (argc > index) {
        device = atoi(argv[index]);
        index++;
    }
    if (argc > index) {
        vid = atoi(argv[index]);
        if ((vid != 1) && (vid != 2)) {
            printf("vid has to be 1 or 2! vid=%d, argv[1]=%s\n",
                   vid, argv[1]);
            usage();
            return 0;
        }
    }
    index++;
    if (argc > index) {
        wposn = atoi(argv[index]);
        index++;
    }
    if (argc > index) {
        framerate = atoi(argv[index]);
        printf("Framerate = %d\n", framerate);
        index++;
    } else
        printf("Using framerate = 30, default value\n");
    if (argc > index) {
        mode = atoi(argv[index]);
        printf("Mode = %s\n", (mode == 1) ? "Auto" : "Manual");
    } else
        printf("Default mode = Manual\n");

    cfd = open_cam_device(O_RDWR, device);
    if (cfd <= 0) {
        printf("Could not open the cam device\n");
        return -1;
    }

    vfd = open((vid == 1) ? VIDEO_DEVICE1 : VIDEO_DEVICE2, O_RDWR);
    if (vfd <= 0) {
        printf("Could not open %s\n",
               (vid == 1) ? VIDEO_DEVICE1 : VIDEO_DEVICE2);
        return -1;
    } else {
        printf("opened %s for rendering\n",
               (vid == 1) ? VIDEO_DEVICE1 : VIDEO_DEVICE2);
    }

    if (ioctl(vfd, VIDIOC_QUERYCAP, &capability) == -1) {
        perror("video VIDIOC_QUERYCAP");
        return -1;
    }
    if (capability.capabilities & V4L2_CAP_STREAMING)
        printf("The video driver is capable of Streaming!\n");
    else {
        printf("The video driver is not capable of Streaming!\n");
        return -1;
    }
    if (ioctl(cfd, VIDIOC_QUERYCAP, &capability) < 0) {
        perror("cam VIDIOC_QUERYCAP");
        return -1;
    }
    if (capability.capabilities & V4L2_CAP_STREAMING)
        printf("The camera driver is capable of Streaming!\n");
    else {
        printf("The camera driver is not capable of Streaming!\n");
        return -1;
    }

    memset(&queryctrl, 0, sizeof(queryctrl));
    memset(&control, 0, sizeof(control));
    queryctrl.id = V4L2_CID_FOCUS_ABSOLUTE;
    if (ioctl(cfd, VIDIOC_QUERYCTRL, &queryctrl) == -1) {
        printf("FOCUS_ABSOLUTE is not supported!\n");
        return -1;
    }
    printf("V4L2_CID_FOCUS_ABSOLUTE support detected:\n");
    printf("\tmin: %d\n", queryctrl.minimum);
    printf("\tmax: %d\n", queryctrl.maximum);

    if (wposn == 1) {
        wposn = queryctrl.maximum; /* Macro */
    } else if (wposn == 2) {
        wposn = (queryctrl.minimum + queryctrl.maximum) / 2;
    } else if (wposn == 3) {
        wposn = queryctrl.minimum; /* Infinity */
    } else {
        printf("Invalid focus position\n");
        return -1;
    }

    ret = setFramerate(cfd, framerate);
    if (ret < 0) {
        printf("ERROR: VIDIOC_S_PARM ioctl cam\n");
        return -1;
    }
    ret = cam_ioctl(cfd, DEFAULT_PIXEL_FMT, DEFAULT_VIDEO_SIZE);
    if (ret < 0) {
        printf("ERROR: VIDIOC_S_FMT ioctl cam\n");
        return -1;
    }

    cformat.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    ret = ioctl(cfd, VIDIOC_G_FMT, &cformat);
    if (ret < 0) {
        perror("cam VIDIOC_G_FMT");
        return -1;
    }
    printf("Camera Image width = %d, Image height = %d, size = %d\n",
           cformat.fmt.pix.width, cformat.fmt.pix.height,
           cformat.fmt.pix.sizeimage);

    vformat.type = V4L2_BUF_TYPE_VIDEO_OUTPUT;
    ret = ioctl(vfd, VIDIOC_G_FMT, &vformat);
    if (ret < 0) {
        perror("video VIDIOC_G_FMT");
        return -1;
    }
    printf("Video Image width = %d, Image height = %d, size = %d\n",
           vformat.fmt.pix.width, vformat.fmt.pix.height,
           vformat.fmt.pix.sizeimage);

    if ((cformat.fmt.pix.width != vformat.fmt.pix.width) ||
        (cformat.fmt.pix.height != vformat.fmt.pix.height)) {
        printf("image sizes don't match!\n");
        set_video_img = 1;
    }
    if (cformat.fmt.pix.pixelformat != vformat.fmt.pix.pixelformat) {
        printf("pixel formats don't match!\n");
        set_video_img = 1;
    }
    if (set_video_img) {
        printf("set video image the same as camera image ...\n");
        vformat.fmt.pix.width = cformat.fmt.pix.width;
        vformat.fmt.pix.height = cformat.fmt.pix.height;
        vformat.fmt.pix.sizeimage = cformat.fmt.pix.sizeimage;
        vformat.fmt.pix.pixelformat = cformat.fmt.pix.pixelformat;
        ret = ioctl(vfd, VIDIOC_S_FMT, &vformat);
        if (ret < 0) {
            perror("video VIDIOC_S_FMT");
            return -1;
        }
        if ((cformat.fmt.pix.width != vformat.fmt.pix.width) ||
            (cformat.fmt.pix.height != vformat.fmt.pix.height) ||
            (cformat.fmt.pix.pixelformat != vformat.fmt.pix.pixelformat)) {
            printf("can't make camera and video image compatible!\n");
            return 0;
        }
    }

    vreqbuf.type = V4L2_BUF_TYPE_VIDEO_OUTPUT;
    vreqbuf.memory = V4L2_MEMORY_MMAP;
    vreqbuf.count = 4;
    if (ioctl(vfd, VIDIOC_REQBUFS, &vreqbuf) == -1) {
        perror("video VIDIOC_REQBUFS");
        return -1;
    }
    printf("Video Driver allocated %d buffers when 4 are requested\n",
           vreqbuf.count);

    vbuffers = calloc(vreqbuf.count, sizeof(*vbuffers));
    for (i = 0; i < vreqbuf.count; ++i) {
        struct v4l2_buffer buffer;
        buffer.type = vreqbuf.type;
        buffer.index = i;
        if (ioctl(vfd, VIDIOC_QUERYBUF, &buffer) == -1) {
            perror("video VIDIOC_QUERYBUF");
            return -1;
        }
        vbuffers[i].length = buffer.length;
        vbuffers[i].start = mmap(NULL, buffer.length,
                                 PROT_READ | PROT_WRITE, MAP_SHARED,
                                 vfd, buffer.m.offset);
        if (vbuffers[i].start == MAP_FAILED) {
            perror("video mmap");
            return -1;
        }
        printf("Video Buffers[%d].start = %p length = %zu\n",
               i, vbuffers[i].start, vbuffers[i].length);
    }

    creqbuf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    creqbuf.memory = V4L2_MEMORY_USERPTR;
    creqbuf.count = 4;
    printf("Requesting %d buffers of type V4L2_MEMORY_USERPTR\n",
           creqbuf.count);
    if (ioctl(cfd, VIDIOC_REQBUFS, &creqbuf) < 0) {
        perror("cam VIDIOC_REQBUFS");
        return -1;
    }
    printf("Camera Driver allowed %d buffers\n", creqbuf.count);

    for (i = 0; i < creqbuf.count; ++i) {
        struct v4l2_buffer buffer;
        buffer.type = creqbuf.type;
        buffer.memory = creqbuf.memory;
        buffer.index = i;
        if (ioctl(cfd, VIDIOC_QUERYBUF, &buffer) < 0) {
            perror("cam VIDIOC_QUERYBUF");
            return -1;
        }
        buffer.flags = 0;
        buffer.m.userptr = (unsigned long)vbuffers[i].start;
        buffer.length = vbuffers[i].length;
        if (ioctl(cfd, VIDIOC_QBUF, &buffer) < 0) {
            perror("cam VIDIOC_QBUF");
            return -1;
        }
    }

    /* set H3A params */
    ret = ioctl(cfd, VIDIOC_PRIVATE_ISP_AF_CFG, &af_config_user);
    if (ret < 0) {
        printf("Error: %d, ", ret);
        perror("ISP_AF_CFG 1");
        return ret;
    }

    /* turn on streaming on both drivers */
    if (ioctl(cfd, VIDIOC_STREAMON, &creqbuf.type) < 0) {
        perror("cam VIDIOC_STREAMON");
        return -1;
    }

    /* capture up to 1000 frames */
    cfilledbuffer.type = creqbuf.type;
    vfilledbuffer.type = vreqbuf.type;
    i = 0;

    buff_size = (af_config_user.paxel_config.hz_cnt + 1) *
                (af_config_user.paxel_config.vt_cnt + 1) * AF_PAXEL_SIZE;
    stats_buff = malloc(buff_size);
    buff_prev_size = (buff_size / 2);

    af_data_user.update = REQUEST_STATISTICS;
    af_data_user.af_statistics_buf = stats_buff;
    af_data_user.frame_number = 8; /* dummy */
    printf("Setting first parameters\n");
    ret = ioctl(cfd, VIDIOC_PRIVATE_ISP_AF_REQ, &af_data_user);
    if (ret < 0)
        perror("ISP_AF_REQ 1");
    printf("Frame No %d\n", af_data_user.frame_number);

    control.id = V4L2_CID_FOCUS_ABSOLUTE;
    if (ioctl(cfd, VIDIOC_G_CTRL, &control) == -1)
        perror("cam VIDIOC_G_CTRL");
    printf("Lens Crt %d\n", control.value);

    control.id = V4L2_CID_FOCUS_ABSOLUTE;
    control.value = wposn;
    if (ioctl(cfd, VIDIOC_S_CTRL, &control) == -1)
        perror("cam VIDIOC_S_CTRL");
    printf("Lens Des %d\n", control.value);
    printf("Frame Curr %d\n", af_data_user.curr_frame);
    af_data_user.frame_number = af_data_user.curr_frame + 10;

request:
    frame_number = af_data_user.frame_number;
    /* request stats */
    af_data_user.update = REQUEST_STATISTICS;
    af_data_user.af_statistics_buf = stats_buff;
    printf("Requesting stats for frame %d, try %d\n", frame_number, j);
    ret = ioctl(cfd, VIDIOC_PRIVATE_ISP_AF_REQ, &af_data_user);
    if (ret < 0)
        perror("ISP_AF_REQ 2");
    printf("Frame No %d\n", af_data_user.frame_number);

    control.id = V4L2_CID_FOCUS_ABSOLUTE;
    if (ioctl(cfd, VIDIOC_G_CTRL, &control) == -1)
        perror("cam VIDIOC_G_CTRL");
    printf("Lens Crt %d\n", control.value);

    control.id = V4L2_CID_FOCUS_ABSOLUTE;
    control.value = wposn;
    if (ioctl(cfd, VIDIOC_S_CTRL, &control) == -1)
        perror("cam VIDIOC_S_CTRL");
    printf("Lens Des %d\n", control.value);
    printf("xs.ts %d:%d\n", af_data_user.xtrastats.ts.tv_sec,
           af_data_user.xtrastats.ts.tv_usec);
    printf("xs.field_count %d\n", af_data_user.xtrastats.field_count);
    printf("xs.lens_position %d\n", af_data_user.xtrastats.lens_position);

    if (af_data_user.af_statistics_buf == NULL) {
        printf("NULL buffer, current frame is %d.\n",
               af_data_user.curr_frame);
        af_data_user.frame_number = af_data_user.curr_frame + 5;
        af_data_user.update = REQUEST_STATISTICS;
        af_data_user.af_statistics_buf = stats_buff;
        goto request;
    } else {
        /* Display stats */
        buff_preview = (__u16 *)af_data_user.af_statistics_buf;
        printf("H3A AE/AWB: buffer to display = %d data pointer = %p\n",
               buff_prev_size, af_data_user.af_statistics_buf);
        for (k = 0; k < 1024; k++)
            fprintf(fp_out, "%6x\n", buff_preview[k]);
    }

    int bytes;
    while (i < 1000) {
        /* De-queue the next available buffer */
        while (ioctl(cfd, VIDIOC_DQBUF, &cfilledbuffer) < 0)
            perror("cam VIDIOC_DQBUF");

        vfilledbuffer.index = cfilledbuffer.index;
        vfilledbuffer.type = V4L2_BUF_TYPE_VIDEO_OUTPUT;
        vfilledbuffer.memory = V4L2_MEMORY_MMAP;
        vfilledbuffer.m.userptr =
            (unsigned long)(vbuffers[cfilledbuffer.index].start);
        vfilledbuffer.length = cfilledbuffer.length;
        if (ioctl(vfd, VIDIOC_QBUF, &vfilledbuffer) < 0) {
            perror("dss VIDIOC_QBUF");
            return -1;
        }
        i++;

        if (i == 3) {
            /* Turn on streaming for video */
            if (ioctl(vfd, VIDIOC_STREAMON, &vreqbuf.type)) {
                perror("dss VIDIOC_STREAMON");
                return -1;
            }
        }
        if (i >= 3) {
            /* De-queue the previous buffer from the video driver */
            if (ioctl(vfd, VIDIOC_DQBUF, &vfilledbuffer)) {
                perror("dss VIDIOC_DQBUF");
                return -1;
            }
            cfilledbuffer.index = vfilledbuffer.index;
            while (ioctl(cfd, VIDIOC_QBUF, &cfilledbuffer) < 0)
                perror("cam VIDIOC_QBUF");
        }

        switch (mode) {
        case 1:
            if (kbhit()) {
                input = getch();
                if (input == '1') {
                    af_data_user.update = 0;
                    ret = ioctl(cfd, VIDIOC_PRIVATE_ISP_AF_REQ,
                                &af_data_user);
                    if (ret < 0) {
                        perror("ISP_AF_REQ 3");
                        return ret;
                    }
                    af_data_user.frame_number = af_data_user.curr_frame;
                    af_data_user.update = REQUEST_STATISTICS;
                    af_data_user.af_statistics_buf = stats_buff;
                    ret = ioctl(cfd, VIDIOC_PRIVATE_ISP_AF_REQ,
                                &af_data_user);
                    if (ret < 0) {
                        perror("ISP_AF_REQ 4");
                    } else {
                        printf("Frame No %d\n", af_data_user.frame_number);
                        printf("xs.ts %d:%d\n",
                               af_data_user.xtrastats.ts.tv_sec,
                               af_data_user.xtrastats.ts.tv_usec);
                        printf("xs.field_count %d\n",
                               af_data_user.xtrastats.field_count);
                        printf("xs.lens_position %d\n",
                               af_data_user.xtrastats.lens_position);
                        buff_preview =
                            (__u16 *)af_data_user.af_statistics_buf;
                        printf("H3A AE/AWB: buffer to display = %d"
                               " data pointer = %p\n", buff_prev_size,
                               af_data_user.af_statistics_buf);
                    }
                } else if (input == '2') {
                    if (wposn > queryctrl.minimum)
                        wposn--;
                    control.id = V4L2_CID_FOCUS_ABSOLUTE;
                    control.value = wposn;
                    if (ioctl(cfd, VIDIOC_S_CTRL, &control) == -1)
                        perror("cam VIDIOC_S_CTRL");
                    printf("Lens position (-1): %d\n", control.value);
                } else if (input == '3') {
                    if (wposn < queryctrl.maximum)
                        wposn++;
                    control.id = V4L2_CID_FOCUS_ABSOLUTE;
                    control.value = wposn;
                    if (ioctl(cfd, VIDIOC_S_CTRL, &control) == -1)
                        perror("cam VIDIOC_S_CTRL");
                    printf("Lens position (+1): %d\n", control.value);
                } else if (input == '4') {
                    wposn = queryctrl.maximum;
                    control.id = V4L2_CID_FOCUS_ABSOLUTE;
                    control.value = wposn;
                    if (ioctl(cfd, VIDIOC_S_CTRL, &control) == -1)
                        perror("cam VIDIOC_S_CTRL");
                    printf("Lens position (macro): %d\n", control.value);
                } else if (input == '5') {
                    wposn = (queryctrl.minimum + queryctrl.maximum) / 2;
                    control.id = V4L2_CID_FOCUS_ABSOLUTE;
                    control.value = wposn;
                    if (ioctl(cfd, VIDIOC_S_CTRL, &control) == -1)
                        perror("cam VIDIOC_S_CTRL");
                    printf("Lens position (intermediate): %d\n",
                           control.value);
                } else if (input == '6') {
                    wposn = queryctrl.minimum;
                    control.id = V4L2_CID_FOCUS_ABSOLUTE;
                    control.value = wposn;
                    if (ioctl(cfd, VIDIOC_S_CTRL, &control) == -1)
                        perror("cam VIDIOC_S_CTRL");
                    printf("Lens position (infinite): %d\n",
                           control.value);
                } else if (input == 'q') {
                    goto done; /* 'q' exits the capture loop */
                }
            }
            break;
        case 2:
            bytes = read(kfd, &keyinfo, sizeof(struct input_event));
            input = keyinfo.code;
            printf("Keycode: %d, Bytes: %d\r", input, bytes);
            fflush(stdout);
            if ((bytes < 0) && (errno != EAGAIN))
                return 1;
            if (bytes > 0 && input == 22) {
                af_data_user.update = 0;
                ret = ioctl(cfd, VIDIOC_PRIVATE_ISP_AF_REQ, &af_data_user);
                if (ret < 0) {
                    perror("ISP_AF_REQ 3");
                    return ret;
                }
                af_data_user.frame_number = af_data_user.curr_frame;
                af_data_user.update = REQUEST_STATISTICS;
                af_data_user.af_statistics_buf = stats_buff;
                ret = ioctl(cfd, VIDIOC_PRIVATE_ISP_AF_REQ, &af_data_user);
                if (ret < 0) {
                    perror("ISP_AF_REQ 4");
                    return ret;
                }
                printf("Frame No %d\n", af_data_user.frame_number);
                printf("xs.ts %d:%d\n",
                       af_data_user.xtrastats.ts.tv_sec,
                       af_data_user.xtrastats.ts.tv_usec);
                printf("xs.field_count %d\n",
                       af_data_user.xtrastats.field_count);
                printf("xs.lens_position %d\n",
                       af_data_user.xtrastats.lens_position);
                buff_preview = (__u16 *)af_data_user.af_statistics_buf;
                printf("H3A AE/AWB: buffer to display = %d"
                       " data pointer = %p\n", buff_prev_size,
                       af_data_user.af_statistics_buf);
            } else if (bytes > 0 && (keyinfo.code == 35)) {
                if (wposn > 0)
                    wposn--;
                control.id = V4L2_CID_FOCUS_ABSOLUTE;
                control.value = wposn;
                if (ioctl(cfd, VIDIOC_S_CTRL, &control) == -1)
                    perror("cam VIDIOC_S_CTRL");
                printf("Lens position (-1): %d\n", control.value);
            } else if (bytes > 0 && (keyinfo.code == 33)) {
                if (wposn < 0xFF)
                    wposn++;
                control.id = V4L2_CID_FOCUS_ABSOLUTE;
                control.value = wposn;
                if (ioctl(cfd, VIDIOC_S_CTRL, &control) == -1)
                    perror("cam VIDIOC_S_CTRL");
                printf("Lens position (+1): %d\n", control.value);
            } else if (bytes > 0 && (keyinfo.code == 36)) {
                wposn = queryctrl.maximum;
                control.id = V4L2_CID_FOCUS_ABSOLUTE;
                control.value = wposn;
                if (ioctl(cfd, VIDIOC_S_CTRL, &control) == -1)
                    perror("cam VIDIOC_S_CTRL");
                printf("Lens position (macro): %d\n", control.value);
            } else if (bytes > 0 && (keyinfo.code == 49)) {
                wposn = (queryctrl.minimum + queryctrl.maximum) / 2;
                control.id = V4L2_CID_FOCUS_ABSOLUTE;
                control.value = wposn;
                if (ioctl(cfd, VIDIOC_S_CTRL, &control) == -1)
                    perror("cam VIDIOC_S_CTRL");
                printf("Lens position (intermediate): %d\n",
                       control.value);
            } else if (bytes > 0 && (keyinfo.code == 47)) {
                wposn = queryctrl.minimum;
                control.id = V4L2_CID_FOCUS_ABSOLUTE;
                control.value = wposn;
                if (ioctl(cfd, VIDIOC_S_CTRL, &control) == -1)
                    perror("cam VIDIOC_S_CTRL");
                printf("Lens position (infinite): %d\n", control.value);
            } else if (bytes > 0 && (keyinfo.code == 37)) {
                goto done; /* quit key exits the capture loop */
            }
            break;
        }
    }

done:
    printf("Captured and rendered %d frames!\n", i);

    if (ioctl(cfd, VIDIOC_STREAMOFF, &creqbuf.type) == -1) {
        perror("cam VIDIOC_STREAMOFF");
        return -1;
    }
    if (ioctl(vfd, VIDIOC_STREAMOFF, &vreqbuf.type) == -1) {
        perror("video VIDIOC_STREAMOFF");
        return -1;
    }

    for (i = 0; i < vreqbuf.count; i++) {
        if (vbuffers[i].start)
            munmap(vbuffers[i].start, vbuffers[i].length);
    }
    free(vbuffers);
    free(stats_buff);
    fclose(fp_out);
    close(kfd);
    close(cfd);
    close(vfd);
    return 0;
}
LEDEffect::LEDEffect() {
    setFramerate(1000, FRAME_ORDER_SEQUENTIAL);
}