/**
 * Prepare WAV recording.
 * @param filename  output file name
 * @param channel   number of channels
 * @param rate      sample rate
 * @param bits      bits per sample
 * @param interval  capture interval (ms)
 */
void openWAV(const tjs_char *filename, int channel, int rate, int bits, int interval)
{
    closeWAV();

    // Open the output file
    wvout = TVPCreateIStream(filename, TJS_BS_WRITE);

    // Specify the capture format
    WAVEFORMATEX waveForm;
    waveForm.wFormatTag      = WAVE_FORMAT_PCM;
    waveForm.nChannels       = channel;
    waveForm.nSamplesPerSec  = rate;
    waveForm.wBitsPerSample  = bits;
    waveForm.nBlockAlign     = waveForm.nChannels * waveForm.wBitsPerSample / 8;
    waveForm.nAvgBytesPerSec = waveForm.nSamplesPerSec * waveForm.nBlockAlign;
    waveForm.cbSize          = 0;   // must be 0 for PCM

    // Open the waveIn device
    if (waveInOpen(&hwi, WAVE_MAPPER, &waveForm,
                   (DWORD_PTR)waveInProc, (DWORD_PTR)this,
                   CALLBACK_FUNCTION) != MMSYSERR_NOERROR) {
        TVPThrowExceptionMessage(L"waveInOpen");
    }

    /* Allocate the capture buffer */
    int length = waveForm.nAvgBytesPerSec * interval / 1000;
    wvhdr.lpData         = new char[length];
    wvhdr.dwBufferLength = length;
    wvhdr.dwFlags        = 0;
    wvhdr.reserved       = 0;

    // Register the buffer
    waveInPrepareHeader(hwi, &wvhdr, sizeof(wvhdr));
    waveInAddBuffer(hwi, &wvhdr, sizeof(wvhdr));
}
// Open wave input channel
int OpenWaveIn(HWAVEIN *hWaveIn, WAVEFORMATEX *wf)
{
    int res;
    char lpTemp[256];

    res = waveInGetNumDevs();
    if (!res) {
        _debug_print("Access WaveIn channel FAILED!", 1);
        return -1;
    }
    else {
        _debug_print("Access WaveIn channel SUCCEED!");
    }

    // Open wave input channel
    res = waveInOpen(hWaveIn, WAVE_MAPPER, wf, (DWORD)NULL, 0L, CALLBACK_WINDOW);
    if (res != MMSYSERR_NOERROR) {
        sprintf(lpTemp, "Open wave input channel FAILED, Error_Code = 0x%x", res);
        _debug_print(lpTemp, 1);
        return -1;
    }
    else {
        _debug_print("Open wave input channel SUCCEED!");
    }
    return 0;
}
#include <windows.h>
#include <mmsystem.h>
#include <stdlib.h>

#define NUM_FRAMES 1024   /* frames per buffer (assumed value; not in the original excerpt) */

int main()
{
    HWAVEIN inStream;
    HWAVEOUT outStream;
    WAVEFORMATEX waveFormat;
    WAVEHDR buffer[4];                               // ping-pong buffers

    waveFormat.wFormatTag      = WAVE_FORMAT_PCM;    // PCM audio
    waveFormat.nSamplesPerSec  = 22050;              // 22050 frames/sec
    waveFormat.nChannels       = 2;                  // stereo
    waveFormat.wBitsPerSample  = 16;                 // 16 bits per sample
    waveFormat.cbSize          = 0;                  // no extra data
    waveFormat.nBlockAlign     = waveFormat.nChannels * waveFormat.wBitsPerSample / 8;
    waveFormat.nAvgBytesPerSec = waveFormat.nBlockAlign * waveFormat.nSamplesPerSec;

    // Event: default security descriptor, manual reset, initially non-signaled
    HANDLE event = CreateEvent(NULL, TRUE, FALSE, "waveout event");

    waveInOpen(&inStream, WAVE_MAPPER, &waveFormat,
               (DWORD_PTR)event, 0, CALLBACK_EVENT);
    waveOutOpen(&outStream, WAVE_MAPPER, &waveFormat,
                (DWORD_PTR)event, 0, CALLBACK_EVENT);

    // initialize the input and output ping-pong buffers
    int index;
    for (index = 0; index < 4; index++) {
        buffer[index].dwBufferLength = NUM_FRAMES * waveFormat.nBlockAlign;
        buffer[index].lpData = (LPSTR)malloc(NUM_FRAMES * waveFormat.nBlockAlign);
        buffer[index].dwFlags = 0;
        waveInPrepareHeader(inStream, &buffer[index], sizeof(WAVEHDR));
        waveOutPrepareHeader(outStream, &buffer[index], sizeof(WAVEHDR));
    }

    ResetEvent(event);
    for (index = 0; index < 4; index++)              // queue all buffers for input
        waveInAddBuffer(inStream, &buffer[index], sizeof(WAVEHDR));
    waveInStart(inStream);

    while (!(buffer[1].dwFlags & WHDR_DONE));        // poll(!) for 2 full input buffers

    // move the two full buffers to output
    waveOutWrite(outStream, &buffer[0], sizeof(WAVEHDR));
    waveOutWrite(outStream, &buffer[1], sizeof(WAVEHDR));

    int inIndex = 2, outIndex = 0;                   // the next input and output to watch
    while (1) {
        // poll for completed input and output buffers
        if (buffer[inIndex].dwFlags & WHDR_DONE) {   // input buffer complete?
            waveInAddBuffer(inStream, &buffer[inIndex], sizeof(WAVEHDR));
            inIndex = (inIndex + 1) % 4;             // next buffer to watch for full
        }
        if (buffer[outIndex].dwFlags & WHDR_DONE) {  // output buffer complete?
            waveOutWrite(outStream, &buffer[outIndex], sizeof(WAVEHDR));
            outIndex = (outIndex + 1) % 4;           // next buffer to watch for empty
        }
    }
}
static DWORD widOpenHelper(WAVEMAPDATA* wim, UINT idx,
                           LPWAVEOPENDESC lpDesc, LPWAVEFORMATEX lpwfx,
                           DWORD dwFlags)
{
    DWORD ret;

    TRACE("(%p %04x %p %p %08x)\n", wim, idx, lpDesc, lpwfx, dwFlags);

    /* source is always PCM, so the formulas below apply */
    lpwfx->nBlockAlign = (lpwfx->nChannels * lpwfx->wBitsPerSample) / 8;
    lpwfx->nAvgBytesPerSec = lpwfx->nSamplesPerSec * lpwfx->nBlockAlign;
    if (dwFlags & WAVE_FORMAT_QUERY) {
        ret = acmStreamOpen(NULL, 0, lpwfx, lpDesc->lpFormat, NULL, 0L, 0L,
                            ACM_STREAMOPENF_QUERY);
    } else {
        ret = acmStreamOpen(&wim->hAcmStream, 0, lpwfx, lpDesc->lpFormat, NULL,
                            0L, 0L, 0L);
    }
    if (ret == MMSYSERR_NOERROR) {
        ret = waveInOpen(&wim->u.in.hInnerWave, idx, lpwfx,
                         (DWORD_PTR)widCallback, (DWORD_PTR)wim,
                         (dwFlags & ~CALLBACK_TYPEMASK) | CALLBACK_FUNCTION);
        if (ret != MMSYSERR_NOERROR && !(dwFlags & WAVE_FORMAT_QUERY)) {
            acmStreamClose(wim->hAcmStream, 0);
            wim->hAcmStream = 0;
        }
    }
    TRACE("ret = %08x\n", ret);
    return ret;
}
static HWAVEIN wavein_open(int32 samples_per_sec, int32 bytes_per_sample)
{
    WAVEFORMATEX wfmt;
    int32 st;
    HWAVEIN h;

    if (bytes_per_sample != sizeof(int16)) {
        fprintf(stderr, "bytes/sample != %d\n", (int)sizeof(int16));
        return NULL;
    }

    wfmt.wFormatTag = WAVE_FORMAT_PCM;
    wfmt.nChannels = 1;
    wfmt.nSamplesPerSec = samples_per_sec;
    wfmt.nAvgBytesPerSec = samples_per_sec * bytes_per_sample;
    wfmt.nBlockAlign = bytes_per_sample;
    wfmt.wBitsPerSample = 8 * bytes_per_sample;

    /* There should be a check here for a device of the desired type; later... */
    st = waveInOpen((LPHWAVEIN)&h, WAVE_MAPPER, (LPWAVEFORMATEX)&wfmt,
                    (DWORD)0L, 0L, (DWORD)CALLBACK_NULL);
    if (st != 0) {
        wavein_error("waveInOpen", st);
        return NULL;
    }

    return h;
}
static void wave_in_tests(void)
{
    WAVEINCAPSA capsA;
    WAVEINCAPSW capsW;
    WAVEFORMATEX format;
    HWAVEIN win;
    MMRESULT rc;
    DWORD preferred, status;
    UINT ndev, d;

    ndev = waveInGetNumDevs();
    trace("found %d WaveIn devices\n", ndev);

    rc = waveInMessage((HWAVEIN)WAVE_MAPPER, DRVM_MAPPER_PREFERRED_GET,
                       (DWORD_PTR)&preferred, (DWORD_PTR)&status);
    ok((ndev == 0 && (rc == MMSYSERR_NODRIVER || rc == MMSYSERR_BADDEVICEID)) ||
       rc == MMSYSERR_NOTSUPPORTED || rc == MMSYSERR_NOERROR,
       "waveInMessage(DRVM_MAPPER_PREFERRED_GET) failed: %u\n", rc);

    if (rc != MMSYSERR_NOTSUPPORTED)
        ok((ndev == 0 && (preferred == -1 || broken(preferred != -1))) ||
           preferred < ndev, "Got invalid preferred device: 0x%x\n", preferred);

    rc = waveInGetDevCapsA(ndev + 1, &capsA, sizeof(capsA));
    ok(rc == MMSYSERR_BADDEVICEID,
       "waveInGetDevCapsA(%s): MMSYSERR_BADDEVICEID expected, got %s\n",
       dev_name(ndev + 1), wave_in_error(rc));

    rc = waveInGetDevCapsA(WAVE_MAPPER, &capsA, sizeof(capsA));
    ok(rc == MMSYSERR_NOERROR || rc == MMSYSERR_NODRIVER ||
       (!ndev && (rc == MMSYSERR_BADDEVICEID)),
       "waveInGetDevCapsA(%s): got %s\n", dev_name(WAVE_MAPPER), wave_in_error(rc));

    rc = waveInGetDevCapsW(ndev + 1, &capsW, sizeof(capsW));
    ok(rc == MMSYSERR_BADDEVICEID || rc == MMSYSERR_NOTSUPPORTED,
       "waveInGetDevCapsW(%s): MMSYSERR_BADDEVICEID or MMSYSERR_NOTSUPPORTED "
       "expected, got %s\n", dev_name(ndev + 1), wave_in_error(rc));

    rc = waveInGetDevCapsW(WAVE_MAPPER, &capsW, sizeof(capsW));
    ok(rc == MMSYSERR_NOERROR || rc == MMSYSERR_NODRIVER ||
       rc == MMSYSERR_NOTSUPPORTED || (!ndev && (rc == MMSYSERR_BADDEVICEID)),
       "waveInGetDevCapsW(%s): got %s\n", dev_name(WAVE_MAPPER), wave_in_error(rc));

    format.wFormatTag = WAVE_FORMAT_PCM;
    format.nChannels = 2;
    format.wBitsPerSample = 16;
    format.nSamplesPerSec = 44100;
    format.nBlockAlign = format.nChannels * format.wBitsPerSample / 8;
    format.nAvgBytesPerSec = format.nSamplesPerSec * format.nBlockAlign;
    format.cbSize = 0;
    rc = waveInOpen(&win, ndev + 1, &format, 0, 0, CALLBACK_NULL);
    ok(rc == MMSYSERR_BADDEVICEID,
       "waveInOpen(%s): MMSYSERR_BADDEVICEID expected, got %s\n",
       dev_name(ndev + 1), wave_in_error(rc));

    for (d = 0; d < ndev; d++)
        wave_in_test_device(d);

    if (ndev > 0)
        wave_in_test_device(WAVE_MAPPER);
}
static void winsnd_read_preprocess(MSFilter *f)
{
    WinSnd *d = (WinSnd*)f->data;
    MMRESULT mr;
    int i;
    int bsize;
    DWORD dwFlag;

    d->stat_input = 0;
    d->stat_output = 0;
    d->stat_notplayed = 0;
    d->stat_minimumbuffer = WINSND_MINIMUMBUFFER;
    winsnd_apply_settings(d);

    /* Init microphone device */
    dwFlag = CALLBACK_FUNCTION;
    if (d->dev_id != WAVE_MAPPER)
        dwFlag = WAVE_MAPPED | CALLBACK_FUNCTION;
    mr = waveInOpen(&d->indev, d->dev_id, &d->wfx,
                    (DWORD)read_callback, (DWORD)f, dwFlag);
    if (mr != MMSYSERR_NOERROR) {
        ms_error("Failed to prepare windows sound device. (waveInOpen:0x%i)", mr);
        /* fall back to the wave mapper */
        mr = waveInOpen(&d->indev, WAVE_MAPPER, &d->wfx,
                        (DWORD)read_callback, (DWORD)f, CALLBACK_FUNCTION);
        if (mr != MMSYSERR_NOERROR) {
            d->indev = NULL;
            ms_error("Failed to prepare windows sound device. (waveInOpen:0x%i)", mr);
            return;
        }
    }

    bsize = WINSND_NSAMPLES * d->wfx.nAvgBytesPerSec / 8000;
    ms_debug("Using input buffers of %i bytes", bsize);
    for (i = 0; i < WINSND_NBUFS; ++i) {
        WAVEHDR *hdr = &d->hdrs_read[i];
        add_input_buffer(d, hdr, bsize);
    }
    d->running = TRUE;
    mr = waveInStart(d->indev);
    if (mr != MMSYSERR_NOERROR) {
        ms_error("waveInStart() error");
        return;
    }
#ifndef _TRUE_TIME
    ms_ticker_set_time_func(f->ticker, winsnd_get_cur_time, d);
#endif
}
void initializeBuffer()
{
    sts = waveInOpen(&hWaveIn, WAVE_MAPPER, &pwfx, 0L, 0L, WAVE_FORMAT_DIRECT);
    if (sts)
        std::cout << "Failed to open waveform input device." << std::endl;
    WaveInHeaderinit();
}
/**
 * win32ai_open(int sample_rate)
 * Set up audio for input at the specified rate.
 * Returns -1 on error, 0 on success.
 */
int win32ai_open(int sample_rate)
{
    WAVEFORMATEX waveFormat;
    MMRESULT res;
    int i;

    // create an event by which the audio driver will notify us
    whinEvent = CreateEvent(NULL, FALSE, FALSE, NULL);

    // populate the wave format struct
    waveFormat.wFormatTag = WAVE_FORMAT_PCM;
    waveFormat.nChannels = 1;
    waveFormat.nSamplesPerSec = sample_rate;
    waveFormat.nAvgBytesPerSec = sample_rate * SAMPLE_BITS / 8;
    waveFormat.nBlockAlign = SAMPLE_BITS / 8;
    waveFormat.wBitsPerSample = SAMPLE_BITS;
    waveFormat.cbSize = 0;

    whinBufNo = 0;
    whinBufIndex = 0;

    // open the audio device
    res = waveInOpen(&wavein, WAVE_MAPPER, &waveFormat,
                     (DWORD)whinEvent, (DWORD)0, CALLBACK_EVENT);
    if (checkWaveInResult(wavein, res, "waveInOpen"))
        return -1;

    // create buffers
    for (i = 0; i < WaveBuf_N; ++i) {
        // allocate buffer header
        whin[i] = (WAVEHDR*)calloc(1, sizeof(WAVEHDR));
        if (whin[i] == NULL) {
            perror("calloc WAVEHDR");
            return -1; /* need to cleanup XXX */
        }
        // allocate buffer
        whin[i]->lpData = malloc(WaveBuf_SIZE);
        if (whin[i]->lpData == NULL) {
            perror("malloc WaveBuf_SIZE");
            return -1; /* need to cleanup XXX */
        }
        whin[i]->dwBufferLength = WaveBuf_SIZE;

        // prepare buffer
        res = waveInPrepareHeader(wavein, whin[i], sizeof(WAVEHDR));
        if (checkWaveInResult(wavein, res, "waveInPrepareHeader"))
            return -1;

        // give buffer to driver
        res = waveInAddBuffer(wavein, whin[i], sizeof(WAVEHDR));
        if (checkWaveInResult(wavein, res, "waveInAddBuffer"))
            return -1; /* need to cleanup XXX */
    }

    // start the device
    res = waveInStart(wavein);
    if (checkWaveInResult(wavein, res, "waveInStart"))
        return -1; /* need to cleanup XXX */

    return 0;
}
static int read_stream_open(struct ausrc_st *st, const struct ausrc_prm *prm,
                            unsigned int dev)
{
    WAVEFORMATEX wfmt;
    MMRESULT res;
    uint32_t sampc;
    unsigned format;
    int i, err = 0;

    st->sampsz = aufmt_sample_size(prm->fmt);

    format = winwave_get_format(prm->fmt);
    if (format == WAVE_FORMAT_UNKNOWN) {
        warning("winwave: source: unsupported sample format (%s)\n",
                aufmt_name(prm->fmt));
        return ENOTSUP;
    }

    /* Open an audio INPUT stream. */
    st->wavein = NULL;
    st->pos = 0;
    st->rdy = false;

    sampc = prm->srate * prm->ch * prm->ptime / 1000;

    for (i = 0; i < READ_BUFFERS; i++) {
        memset(&st->bufs[i].wh, 0, sizeof(WAVEHDR));
        st->bufs[i].mb = mbuf_alloc(st->sampsz * sampc);
        if (!st->bufs[i].mb)
            return ENOMEM;
    }

    wfmt.wFormatTag = format;
    wfmt.nChannels = prm->ch;
    wfmt.nSamplesPerSec = prm->srate;
    wfmt.wBitsPerSample = (WORD)(st->sampsz * 8);
    wfmt.nBlockAlign = (prm->ch * wfmt.wBitsPerSample) / 8;
    wfmt.nAvgBytesPerSec = wfmt.nSamplesPerSec * wfmt.nBlockAlign;
    wfmt.cbSize = 0;

    res = waveInOpen(&st->wavein, dev, &wfmt,
                     (DWORD_PTR)waveInCallback, (DWORD_PTR)st,
                     CALLBACK_FUNCTION | WAVE_FORMAT_DIRECT);
    if (res != MMSYSERR_NOERROR) {
        warning("winwave: waveInOpen: failed res=%d\n", res);
        return EINVAL;
    }

    /* Prepare enough IN buffers to hold at least 50ms of data */
    for (i = 0; i < READ_BUFFERS; i++)
        err |= add_wave_in(st);

    waveInStart(st->wavein);

    return err;
}
void CAudioInput::OpenMic()
{
    SetWaveHDRInit();
    SetWaveInitFormat();
    waveInOpen(&h_input, 0, &my_wave_format,
               (DWORD)waveInProc, 0, CALLBACK_FUNCTION);
    waveInPrepareHeader(h_input, mp_wave_header, sizeof(WAVEHDR));
    waveInAddBuffer(h_input, mp_wave_header, sizeof(WAVEHDR));
    waveInStart(h_input);
}
// Define: _start(WORD, DWORD, DWORD) (real)
DWORD waveCapture::_start(const WORD uDevice, const DWORD dwBufferLength,
                          const DWORD dwNumBuffers)
{
    // only one start per session is permitted
    if (_recording)
        return ERR_ALREADYSTARTED;

    // dwNumBuffers is needed in stop() as well, so stash it in __dwNumBuffers
    __dwNumBuffers = dwNumBuffers;

    // Fill in the WAVEFORMATEX structure (wf):
    wf.wFormatTag = WAVE_FORMAT_PCM;
    wf.wBitsPerSample = _wBitsPerSample;
    wf.nChannels = _nChannels;
    wf.nSamplesPerSec = _dwSamplePerSec;
    wf.nBlockAlign = (wf.nChannels * wf.wBitsPerSample) / 8;
    wf.nAvgBytesPerSec = (wf.nSamplesPerSec * wf.nBlockAlign);
    wf.cbSize = 0;

    // Create event:
    hevent = CreateEvent(NULL, FALSE, FALSE, NULL);
    if (hevent == NULL)
        return ERR_EVENTNULL;

    // Open the input device
    if (waveInOpen(&hwi, uDevice, (LPWAVEFORMATEX)&wf, (DWORD)hevent, 0,
                   CALLBACK_EVENT) != MMSYSERR_NOERROR)
        return ERR_WAVEINOPEN;

    // Allocate and queue the WAVEHDR buffers:
    buff = new WAVEHDR*[dwNumBuffers];
    for (int i = 0; i < (int)dwNumBuffers; i++) {
        buff[i] = new WAVEHDR;
        ZeroMemory(buff[i], sizeof(WAVEHDR));
        buff[i]->lpData = (char*)malloc(dwBufferLength);
        buff[i]->dwBufferLength = dwBufferLength;
        buff[i]->dwBytesRecorded = 0;
        buff[i]->dwUser = 0;
        buff[i]->dwFlags = 0;
        buff[i]->dwLoops = 0;
        if (waveInPrepareHeader(hwi, buff[i], sizeof(WAVEHDR)) != MMSYSERR_NOERROR)
            return ERR_WAVEINPREPAREHEADER;
        if (waveInAddBuffer(hwi, buff[i], sizeof(WAVEHDR)) != MMSYSERR_NOERROR)
            return ERR_WAVEINADDBUFFER;
    }

    // Start capturing...
    if (waveInStart(hwi) != MMSYSERR_NOERROR)
        return ERR_WAVEINSTART;

    _dwBufferCount = 0;
    dwTotalBufferLength = 0;
    _recording = true;
    return ERR_NOERROR;
}
// Open the recording device
BOOL CWaveIn::OpenDev()
{
    // already open?
    if (m_bDevOpen) {
        return FALSE;
    }

    // set the capture format
    WAVEFORMATEX wfx = {0};
    wfx.wFormatTag = WAVE_FORMAT_PCM;
    wfx.nChannels = m_wChannel;
    wfx.nSamplesPerSec = m_dwSample;
    wfx.nAvgBytesPerSec = m_wChannel * m_dwSample * m_wBit / 8;
    wfx.nBlockAlign = m_wBit * m_wChannel / 8;
    wfx.wBitsPerSample = m_wBit;
    wfx.cbSize = 0;

    // check whether the device supports the requested format
    m_mmr = waveInOpen(0, WAVE_MAPPER, &wfx, 0, 0, WAVE_FORMAT_QUERY);
    if (m_mmr) {
        return FALSE;
    }

    // open the recording device
    m_mmr = waveInOpen(&m_hIn, WAVE_MAPPER, &wfx, m_dwAudioInId,
                       s_dwInstance, CALLBACK_THREAD);
    if (m_mmr) {
        return FALSE;
    }

    // mark the device as open
    m_bDevOpen = TRUE;
    return TRUE;
}
/**
 * Prepare recording. Initialize and acquire recording resources.
 */
static javacall_result init_recording(recorder_handle* hRecord)
{
    WAVEFORMATEX format;

    /* Set up the wave-in format */
    format.wFormatTag = WAVE_FORMAT_PCM;
    format.cbSize = 0;
    format.wBitsPerSample = 16;
    format.nSamplesPerSec = 22050;
    format.nChannels = 1;
    format.nBlockAlign = format.nChannels * (format.wBitsPerSample / 8);
    format.nAvgBytesPerSec = format.nSamplesPerSec * format.nBlockAlign;

    /* Open the wave-in hardware */
    if (NULL == hRecord->hWAVEIN) {
        MMRESULT mmReturn = 0;
        mmReturn = waveInOpen(&hRecord->hWAVEIN, WAVE_MAPPER, &format,
                              (DWORD)waveInProc, (DWORD)hRecord,
                              CALLBACK_FUNCTION);
        if (MMSYSERR_NOERROR != mmReturn) {
            hRecord->hWAVEIN = NULL;
            return JAVACALL_FAIL;
        }
    }

    /* Create a WAV file for (temporary) storage */
    if (NULL == hRecord->hFile) {
        HMMIO hFile;
        /* Create a temp file if no name has been preset. */
        if (0x0 == hRecord->fileName[0]) {
            GetTempPath(MAX_PATH, hRecord->fileName);
            GetTempFileName(hRecord->fileName, "record", 0, hRecord->fileName);
        }
        if (JAVACALL_SUCCEEDED(create_wav_file(hRecord->fileName, &format, &hFile))) {
            hRecord->hFile = hFile;
        } else {
            waveInClose(hRecord->hWAVEIN);
            hRecord->hWAVEIN = NULL;
            return JAVACALL_FAIL;
        }
    }

    /* Add input buffer */
    if (!JAVACALL_SUCCEEDED(add_input_buffer(hRecord, &format))) {
        waveInClose(hRecord->hWAVEIN);
        close_wav_file(hRecord);
        hRecord->hWAVEIN = NULL;
        return JAVACALL_FAIL;
    }

    return JAVACALL_OK;
}
// Acquire and enable the audio card.
// Returns 0 if ok, -1 if failed.
int BAE_AquireAudioCapture(void *threadContext, unsigned long sampleRate,
                           unsigned long channels, unsigned long bits,
                           unsigned long *pCaptureHandle)
{
    MMRESULT theErr;
    WAVEINCAPS caps;
    WAVEFORMATEX format;
    UINT deviceID;

    g_bitSize = bits;
    g_channels = channels;
    g_sampleRate = sampleRate;
    if (pCaptureHandle) {
        *pCaptureHandle = 0L;
    }

    deviceID = WAVE_MAPPER;
    if (g_soundDeviceIndex) {
        deviceID = g_soundDeviceIndex - 1;
    }

    theErr = waveInGetDevCaps(deviceID, &caps, sizeof(WAVEINCAPS));
    if (theErr == MMSYSERR_NOERROR) {
        format.wFormatTag = WAVE_FORMAT_PCM;
        format.nSamplesPerSec = sampleRate;
        format.wBitsPerSample = (WORD)bits;
        format.nChannels = (WORD)channels;
        format.nBlockAlign = (WORD)((format.wBitsPerSample * format.nChannels) / 8);
        format.nAvgBytesPerSec = format.nSamplesPerSec * format.nBlockAlign;
        format.cbSize = 0;
        theErr = waveInOpen(&g_captureSound, deviceID, &format,
                            0L /*(DWORD)PV_AudioCaptureCallback*/,
                            (DWORD)threadContext,
                            CALLBACK_NULL /*CALLBACK_FUNCTION*/);
        if (theErr == MMSYSERR_NOERROR) {
            g_captureShutdown = FALSE;
            if (pCaptureHandle) {
                *pCaptureHandle = (unsigned long)g_captureSound;
            }
        } else {
            BAE_ReleaseAudioCapture(threadContext);
        }
    }
    return (theErr == MMSYSERR_NOERROR) ? 0 : -1;
}
MMRESULT win_wave_open(snd_type snd, UINT devtoopen, HWAVE *hptr)
{
    WAVEFORMATEX wfmt;
    int bperf;

    switch (snd->format.mode) {
    case SND_MODE_UNKNOWN:
    case SND_MODE_PCM:
        /* note: Windows uses signed PCM (PCM) for 16-bit data, and
         * unsigned PCM (UPCM) for 8-bit data */
        if (snd->format.bits != 8) {
            wfmt.wFormatTag = WAVE_FORMAT_PCM;
            break;
        } else
            return WAVERR_BADFORMAT;
    case SND_MODE_ADPCM:
    case SND_MODE_ALAW:
    case SND_MODE_ULAW:
    case SND_MODE_FLOAT:
    case SND_MODE_UPCM:
        if (snd->format.bits == 8) {
            wfmt.wFormatTag = WAVE_FORMAT_PCM;
            break;
        } else
            return WAVERR_BADFORMAT;
    default:
        return MMSYSERR_INVALPARAM;
    }

    wfmt.nChannels = (unsigned short)snd->format.channels;
    wfmt.nSamplesPerSec = (long)(snd->format.srate + 0.5);
    bperf = snd_bytes_per_frame(snd);
    wfmt.nAvgBytesPerSec = (long)(bperf * snd->format.srate + 0.5);
    wfmt.nBlockAlign = (unsigned short)bperf;
    wfmt.wBitsPerSample = (unsigned short)snd->format.bits;
    wfmt.cbSize = 0;

    if (snd->write_flag != SND_READ) {
        /* Under Windows2000 on an IBM A21p Thinkpad, waveOutOpen will
           CRASH (!) if snd->format.srate is 110250, but it actually worked
           with 4000, and it worked with 96000, so I will assume sample
           rates above 96000 are bad. -RBD 24Jul02 */
        if (wfmt.nSamplesPerSec > 96000) {
            return WAVERR_BADFORMAT;
        }
        /* otherwise, we're ready to open the device */
        return waveOutOpen((LPHWAVEOUT)hptr, devtoopen, &wfmt, 0L, 0L,
                           (DWORD)CALLBACK_NULL);
    } else {
        return waveInOpen((LPHWAVEIN)hptr, devtoopen, &wfmt, 0L, 0L,
                          (DWORD)CALLBACK_NULL);
    }
}
UINT PASCAL _Cover_waveInOpen(HWAVEIN *lphWaveIn, UINT uDeviceID,
                              const WAVEFORMAT *lpFormat, DWORD dwCallback,
                              DWORD dwInstance, DWORD dwFlags)
{
    if ((dwFlags & CALLBACK_TYPEMASK) == CALLBACK_FUNCTION) {
        CallBackFunc = dwCallback;
        dwCallback = GetWaveInCallBack(__WaveInCallBack);
    }
    return waveInOpen(lphWaveIn, uDeviceID, lpFormat, dwCallback,
                      dwInstance, dwFlags);
}
int tdav_producer_waveapi_start(tmedia_producer_t* self)
{
    tdav_producer_waveapi_t* producer = (tdav_producer_waveapi_t*)self;
    MMRESULT result;
    tsk_size_t i;

    if (!producer) {
        TSK_DEBUG_ERROR("Invalid parameter");
        return -1;
    }

    if (producer->started || producer->hWaveIn) {
        TSK_DEBUG_WARN("Producer already started");
        return 0;
    }

    /* create events */
    if (!producer->events[0]) {
        producer->events[0] = CreateEvent(NULL, FALSE, FALSE, NULL);
    }
    if (!producer->events[1]) {
        producer->events[1] = CreateEvent(NULL, FALSE, FALSE, NULL);
    }

    /* open */
    result = waveInOpen((HWAVEIN *)&producer->hWaveIn, /*WAVE_MAPPER*/0,
                        &producer->wfx, (DWORD)producer->events[0],
                        (DWORD_PTR)producer, CALLBACK_EVENT);
    if (result != MMSYSERR_NOERROR) {
        print_last_error(result, "waveInOpen");
        return -2;
    }

    /* start */
    result = waveInStart(producer->hWaveIn);
    if (result != MMSYSERR_NOERROR) {
        print_last_error(result, "waveInStart");
        return -2;
    }

    /* start thread */
    tsk_thread_create(&producer->tid[0], __record_thread, producer);

    /* queue the capture buffers */
    for (i = 0; i < sizeof(producer->hWaveHeaders)/sizeof(LPWAVEHDR); i++) {
        add_wavehdr(producer, i);
    }

    producer->started = tsk_true;

    return 0;
}
BOOL WaveIn::open(int channels, int freq, int bits, HWND hWnd)
{
    wfex.cbSize = 0;   /* must be 0 for PCM (counts extra format bytes, not the struct size) */
    wfex.wFormatTag = WAVE_FORMAT_PCM;
    wfex.nChannels = channels;
    wfex.nSamplesPerSec = freq;
    wfex.wBitsPerSample = bits;
    wfex.nBlockAlign = wfex.nChannels * (wfex.wBitsPerSample / 8);
    wfex.nAvgBytesPerSec = wfex.nSamplesPerSec * wfex.nBlockAlign;

    MMRESULT res;
    res = waveInOpen(&hWaveIn, WAVE_MAPPER, &wfex, (DWORD)hWnd, 0,
                     CALLBACK_WINDOW);
    return (res == MMSYSERR_NOERROR);
}
static void wave_in_tests(void)
{
    WAVEINCAPSA capsA;
    WAVEINCAPSW capsW;
    WAVEFORMATEX format;
    HWAVEIN win;
    MMRESULT rc;
    UINT ndev, d;

    ndev = waveInGetNumDevs();
    trace("found %d WaveIn devices\n", ndev);

    rc = waveInGetDevCapsA(ndev + 1, &capsA, sizeof(capsA));
    ok(rc == MMSYSERR_BADDEVICEID,
       "waveInGetDevCapsA(%s): MMSYSERR_BADDEVICEID expected, got %s\n",
       dev_name(ndev + 1), wave_in_error(rc));

    rc = waveInGetDevCapsA(WAVE_MAPPER, &capsA, sizeof(capsA));
    ok(rc == MMSYSERR_NOERROR || rc == MMSYSERR_NODRIVER ||
       (!ndev && (rc == MMSYSERR_BADDEVICEID)),
       "waveInGetDevCapsA(%s): got %s\n", dev_name(WAVE_MAPPER), wave_in_error(rc));

    rc = waveInGetDevCapsW(ndev + 1, &capsW, sizeof(capsW));
    ok(rc == MMSYSERR_BADDEVICEID || rc == MMSYSERR_NOTSUPPORTED,
       "waveInGetDevCapsW(%s): MMSYSERR_BADDEVICEID or MMSYSERR_NOTSUPPORTED "
       "expected, got %s\n", dev_name(ndev + 1), wave_in_error(rc));

    rc = waveInGetDevCapsW(WAVE_MAPPER, &capsW, sizeof(capsW));
    ok(rc == MMSYSERR_NOERROR || rc == MMSYSERR_NODRIVER ||
       rc == MMSYSERR_NOTSUPPORTED || (!ndev && (rc == MMSYSERR_BADDEVICEID)),
       "waveInGetDevCapsW(%s): got %s\n", dev_name(WAVE_MAPPER), wave_in_error(rc));

    format.wFormatTag = WAVE_FORMAT_PCM;
    format.nChannels = 2;
    format.wBitsPerSample = 16;
    format.nSamplesPerSec = 44100;
    format.nBlockAlign = format.nChannels * format.wBitsPerSample / 8;
    format.nAvgBytesPerSec = format.nSamplesPerSec * format.nBlockAlign;
    format.cbSize = 0;
    rc = waveInOpen(&win, ndev + 1, &format, 0, 0, CALLBACK_NULL);
    ok(rc == MMSYSERR_BADDEVICEID,
       "waveInOpen(%s): MMSYSERR_BADDEVICEID expected, got %s\n",
       dev_name(ndev + 1), wave_in_error(rc));

    for (d = 0; d < ndev; d++)
        wave_in_test_device(d);

    if (ndev > 0)
        wave_in_test_device(WAVE_MAPPER);
}
BOOL Mic_Init_Physical()
{
    if (Mic_Inited)
        return TRUE;

    Mic_Inited = FALSE;

    HRESULT hr;
    WAVEFORMATEX wfx;

    memset(Mic_TempBuf, 0x80, MIC_BUFSIZE);
    memset(Mic_Buffer[0], 0x80, MIC_BUFSIZE);
    memset(Mic_Buffer[1], 0x80, MIC_BUFSIZE);
    Mic_BufPos = 0;
    Mic_WriteBuf = 0;
    Mic_PlayBuf = 1;

    memset(&wfx, 0, sizeof(wfx));
    wfx.cbSize = 0;
    wfx.wFormatTag = WAVE_FORMAT_PCM;
    wfx.nChannels = 1;
    wfx.nSamplesPerSec = 16000;
    wfx.nBlockAlign = 1;
    wfx.nAvgBytesPerSec = 16000;
    wfx.wBitsPerSample = 8;

    hr = waveInOpen(&waveIn, WAVE_MAPPER, &wfx,
                    (DWORD_PTR)waveInProc, 0, CALLBACK_FUNCTION);
    MIC_CHECKERR(hr)

    memset(&waveHdr, 0, sizeof(waveHdr));
    waveHdr.lpData = (LPSTR)Mic_TempBuf;
    waveHdr.dwBufferLength = MIC_BUFSIZE;
    hr = waveInPrepareHeader(waveIn, &waveHdr, sizeof(WAVEHDR));
    MIC_CHECKERR(hr)
    hr = waveInAddBuffer(waveIn, &waveHdr, sizeof(WAVEHDR));
    MIC_CHECKERR(hr)
    hr = waveInStart(waveIn);
    MIC_CHECKERR(hr)

    Mic_Inited = TRUE;
    INFO("win32 microphone init OK\n");
    return TRUE;
}
bool QWindowsAudioDeviceInfo::testSettings(const QAudioFormat &format) const
{
    WAVEFORMATEXTENSIBLE wfx;
    if (qt_convertFormat(format, &wfx)) {
        // query only, do not open the device
        if (mode == QAudio::AudioOutput) {
            return (waveOutOpen(NULL, UINT_PTR(devId), &wfx.Format, 0, 0,
                                WAVE_FORMAT_QUERY) == MMSYSERR_NOERROR);
        } else { // AudioInput
            return (waveInOpen(NULL, UINT_PTR(devId), &wfx.Format, 0, 0,
                               WAVE_FORMAT_QUERY) == MMSYSERR_NOERROR);
        }
    }
    return false;
}
/*------------------------------------------------------------------------------*/
static void WaveInSync(void)
{
    MMRESULT mmRes;

    switch (WaveInStep) {
    case -1:
        dprintf(0, 0, "WaveIn Error %d (0x%08X)",
                WaveInErrorStep, (int)WaveInErrorRes);
        break;
    case 0:
        if (SettingData.WAVOut != 0) {
            WaveInStep = 1;
        }
        break;
    case 1:
        mmRes = waveInOpen(&HdWaveIn, WaveInUse, &WaveFormat,
                           (DWORD)WaveInProc, 0, CALLBACK_FUNCTION);
        if (WaveInErrorCheck(mmRes) != FALSE) {
            return;
        }
        WaveInStep = 2;
        break;
    case 2:
        if (WaveInStatus == WAVEINSTATUS_OPEN) {
            WaveInStep = 3;
        }
        break;
    case 3:
        WaveOutNextNo = 0;
        WaveOutLastNo = 0;
        for (int i = 0; i < WAVEBUFFNUM; i++) {
            WaveBuff[i].dwUser = 0;
            mmRes = waveInPrepareHeader(HdWaveIn, &WaveBuff[i], sizeof(WAVEHDR));
            if (WaveInErrorCheck(mmRes) != FALSE) {
                return;
            }
            mmRes = waveInAddBuffer(HdWaveIn, &WaveBuff[i], sizeof(WAVEHDR));
            if (WaveInErrorCheck(mmRes) != FALSE) {
                return;
            }
        }
        mmRes = waveInStart(HdWaveIn);
        if (WaveInErrorCheck(mmRes) != FALSE) {
            return;
        }
        WaveInStep = 4;
        break;
    case 4:
        if (SettingData.WAVOut == 0) {
            WaveInStep = 5;
        }
        break;
    case 5:
        waveInStop(HdWaveIn);
        waveInReset(HdWaveIn);
        for (int i = 0; i < WAVEBUFFNUM; i++) {
            waveInUnprepareHeader(HdWaveIn, &WaveBuff[i], sizeof(WAVEHDR));
        }
        waveInClose(HdWaveIn);
        WaveInStep = 0;
        break;
    }
}
static int read_stream_open(struct ausrc_st *st, const struct ausrc_prm *prm,
                            unsigned int dev)
{
    WAVEFORMATEX wfmt;
    MMRESULT res;
    uint32_t sampc;
    int i, err = 0;

    /* Open an audio INPUT stream. */
    st->wavein = NULL;
    st->bufs_idx = 0;

    sampc = prm->srate * prm->ch * prm->ptime / 1000;

    for (i = 0; i < READ_BUFFERS; i++) {
        memset(&st->bufs[i].wh, 0, sizeof(WAVEHDR));
        st->bufs[i].mb = mbuf_alloc(2 * sampc);
        if (!st->bufs[i].mb)
            return ENOMEM;
    }

    wfmt.wFormatTag = WAVE_FORMAT_PCM;
    wfmt.nChannels = prm->ch;
    wfmt.nSamplesPerSec = prm->srate;
    wfmt.wBitsPerSample = 16;
    wfmt.nBlockAlign = (prm->ch * wfmt.wBitsPerSample) / 8;
    wfmt.nAvgBytesPerSec = wfmt.nSamplesPerSec * wfmt.nBlockAlign;
    wfmt.cbSize = 0;

    res = waveInOpen(&st->wavein, dev, &wfmt,
                     (DWORD_PTR)waveInCallback, (DWORD_PTR)st,
                     CALLBACK_FUNCTION | WAVE_FORMAT_DIRECT);
    if (res != MMSYSERR_NOERROR) {
        warning("winwave: waveInOpen: failed %d\n", res);
        return EINVAL;
    }

    /* Prepare enough IN buffers to hold at least 50ms of data */
    for (i = 0; i < READ_BUFFERS; i++)
        err |= add_wave_in(st);

    waveInStart(st->wavein);

    return err;
}
int WaveInTest(UINT device, DWORD samplerate, WORD depth, WORD channels)
{
    struct WAVEPARAMS wp;
    WAVEFORMATEX wf;
    MMRESULT mmr;
    HWAVEIN hw = 0;

    wp.dev_in = device;
    wp.channels = channels;
    wp.depth = depth;
    wp.length = 0;
    wp.samplerate = samplerate;

    WaveGetFormat(&wp, &wf, 0);

    if ((mmr = waveInOpen(&hw, device, &wf, 0, 0,
                          WAVE_FORMAT_DIRECT | WAVE_FORMAT_QUERY |
                          CALLBACK_NULL)) != MMSYSERR_NOERROR)
        return 0;

    if (hw)
        waveInClose(hw);

    return 1;
}
tbool CDeviceWaveIO::Start()
{
    if ((mpCallback) && (mppfOutputs) && (mppfInputs) && (!mbStarted)) {
        mbBlocked_volatile = false;

        WAVEFORMATEX format;
        InitFormat(format, (tint32)mfSampleRate);

        // Open driver - will invoke waveOutProc_static with uMsg == WOM_OPEN
        MMRESULT mmres = MMSYSERR_ERROR;   // Default => unspecified error
        if (mbOutput) {
            mmres = waveOutOpen(&mhandle_out, muiDevIndex, &format,
                                (DWORD)&waveOutProc_static, (DWORD)this,
                                CALLBACK_FUNCTION);
        }
        if (mbInput) {
            mmres = waveInOpen(&mhandle_in, muiDevIndex, &format,
                               (DWORD)&waveInProc_static, (DWORD)this,
                               CALLBACK_FUNCTION);
        }
        /*
          MMSYSERR_ALLOCATED    Specified resource is already allocated.
          MMSYSERR_BADDEVICEID  Specified device identifier is out of range.
          MMSYSERR_NODRIVER     No device driver is present.
          MMSYSERR_NOMEM        Unable to allocate or lock memory.
          WAVERR_BADFORMAT
        */
        if (mmres == MMSYSERR_NOERROR) {
            mmres = StartBuffers();
        }

        mbStarted = (mmres == MMSYSERR_NOERROR);
        if (!mbStarted)
            mbBlocked_volatile = true;
    }
    return mbStarted;
} // Start
MMRESULT RUIAudio::RecordOpen()
{
    RecordClose();

#ifdef _USE_AAC_CODEC
    if (m_pRUIBufferListPCM == NULL) {
        ASSERT(FALSE);
        return ERR_AUDIO_RECORD_CANTADD;
    }
#else
    if (m_pRUIBufferListSend == NULL) {
        ASSERT(FALSE);
        return ERR_AUDIO_RECORD_CANTADD;
    }
#endif

    if (waveInOpen(&m_hWaveIn, m_nWaveInDeviceID, (LPWAVEFORMATEX)&m_WFEX,
                   (ULONG)WaveInProc, (DWORD_PTR)this,
                   CALLBACK_FUNCTION) != MMSYSERR_NOERROR)
        return ERR_AUDIO_RECORD_CANTOPEN;

    RUIBuffer* pRUIBuffer = m_RecBufferList.GetFirst();
    while (pRUIBuffer != NULL) {
        RecAddBuffer(pRUIBuffer);
        pRUIBuffer = m_RecBufferList.GetNext(pRUIBuffer);
    }

    if (waveInStart(m_hWaveIn) != MMSYSERR_NOERROR) {
        RecordClose();
        return ERR_AUDIO_RECORD_CANTOPEN;
    }

    if (!Run()) {
        ASSERT(FALSE);
        RecordClose();
        return ERR_AUDIO_THREAD_DOESNTEXIST;
    }

    return ERR_AUDIO_NOERROR;
}
/*
 * Class:     com_ibm_media_protocol_device_DataSource
 * Method:    connectDevice
 * Signature: ()V
 */
JNIEXPORT void JNICALL Java_com_ibm_media_protocol_device_DataSource_connectDevice
  (JNIEnv* env, jobject obj)
{
    MMRESULT result;

    /* we currently use PCM, 22050 Hz, mono, 16-bit */
    WAVEFORMATEX format = { WAVE_FORMAT_PCM, 1, 22050, 44100, 2, 16, 0 };

    result = waveInOpen(&hwi, WAVE_MAPPER, &format, waveInProc, NULL,
                        CALLBACK_FUNCTION);

    /* DEBUG */
    if (result == MMSYSERR_NOERROR)
        printf("Device opened\n");
    else
        printf("ERROR opening device!\n");
    /* end DEBUG */

    printf("no. of devices %d\n", waveInGetNumDevs());
}
static int read_stream_open(struct ausrc_st *st)
{
    MMRESULT err;
    WAVEFORMATEX wfmt;
    int i;

    /* Open an audio INPUT stream. */
    st->wavein = NULL;
    st->pos = 0;
    st->rdy = false;
    st->stop = false;

    for (i = 0; i < READ_BUFFERS; i++) {
        memset(&st->bufs[i].wh, 0, sizeof(WAVEHDR));
        st->bufs[i].mb = mbuf_alloc(2 * st->prm.frame_size);
    }

    wfmt.wFormatTag = WAVE_FORMAT_PCM;
    wfmt.nChannels = st->prm.ch;
    wfmt.nSamplesPerSec = st->prm.srate;
    wfmt.wBitsPerSample = 16;
    wfmt.nBlockAlign = (st->prm.ch * wfmt.wBitsPerSample) / 8;
    wfmt.nAvgBytesPerSec = wfmt.nSamplesPerSec * wfmt.nBlockAlign;
    wfmt.cbSize = 0;

    err = waveInOpen(&st->wavein, WAVE_MAPPER, &wfmt,
                     (DWORD_PTR)waveInCallback, (DWORD_PTR)st,
                     CALLBACK_FUNCTION | WAVE_FORMAT_DIRECT);
    if (err != MMSYSERR_NOERROR) {
        DEBUG_WARNING("waveInOpen: failed %d\n", err);
        return EINVAL;
    }

    /* Prepare enough IN buffers to hold at least 50ms of data */
    for (i = 0; i < READ_BUFFERS; i++)
        add_wave_in(st);

    waveInStart(st->wavein);

    return 0;
}
void RecordChannelOnOff(void)
{
    static int ChannelOn = 0;
    int err;
    WAVEFORMATEX fmt;
    WAVEHDR hdr;
    HWAVEIN hwi = 1;

    fmt.nSamplesPerSec = 22050;
    fmt.wBitsPerSample = 16;
    fmt.wFormatTag = WAVE_FORMAT_PCM;
    fmt.nChannels = 2;
    fmt.nBlockAlign = fmt.wBitsPerSample * fmt.nChannels / 8;
    fmt.nAvgBytesPerSec = fmt.nSamplesPerSec * fmt.nBlockAlign;

    hdr.lpData = (LPSTR)0x30800000;  // _NONCACHE_STARTADDRESS
    hdr.dwBufferLength = BUF_SIZE;

    if (!ChannelOn) {
        err = waveInOpen(&hwi, 0, &fmt, 0, 0, 0);
        Uart_Printf("\nerr = %x\n", err);
        if (!err) {
            waveInAddBuffer(hwi, &hdr, 0);
            waveInStart(hwi);
            Uart_Printf("Record channel on\n");
            ChannelOn = 1;
        }
    }
    else {
        waveInClose(hwi);
        Uart_Printf("Record channel off\n");
        ChannelOn = 0;
    }
}