SourceInfoWidget::SourceInfoWidget( const Tomahawk::source_ptr& source, QWidget* parent )
    : QWidget( parent )
    , ui( new Ui::SourceInfoWidget )
{
    ui->setupUi( this );

    ui->historyView->overlay()->setEnabled( false );

    m_recentCollectionModel = new CollectionFlatModel( ui->recentCollectionView );
    ui->recentCollectionView->setModel( m_recentCollectionModel );
    m_recentCollectionModel->addFilteredCollection( source->collection(), 250, DatabaseCommand_AllTracks::ModificationTime );

    m_historyModel = new PlaylistModel( ui->historyView );
    ui->historyView->setModel( m_historyModel );
    m_historyModel->loadHistory( source );

    connect( source.data(), SIGNAL( playbackFinished( Tomahawk::query_ptr ) ),
                              SLOT( onPlaybackFinished( Tomahawk::query_ptr ) ) );

    ui->recentCollectionView->setColumnHidden( TrackModel::Bitrate, true );
    ui->recentCollectionView->setColumnHidden( TrackModel::Origin, true );
    ui->recentCollectionView->setColumnHidden( TrackModel::Filesize, true );

    ui->historyView->setColumnHidden( TrackModel::Bitrate, true );
    ui->historyView->setColumnHidden( TrackModel::Origin, true );
    ui->historyView->setColumnHidden( TrackModel::Filesize, true );

    m_recentAlbumModel = new AlbumModel( ui->recentAlbumView );
    ui->recentAlbumView->setModel( m_recentAlbumModel );
    m_recentAlbumModel->addFilteredCollection( source->collection(), 20, DatabaseCommand_AllAlbums::ModificationTime );

    m_title = tr( "Info about %1" ).arg( source->isLocal() ? tr( "Your Collection" ) : source->friendlyName() );
}
void PlaybackActionClient::playbackDoneCallback(const actionlib::SimpleClientGoalState &goal,
                                                const pumpkin_messages::PlaybackResultConstPtr &result)
{
    _running = false;
    if (_scene)
        Q_EMIT(blockOnScene(false));
    else
        Q_EMIT(blockOnPlayback(false));

    switch (static_cast<pumpkin_messages::IOState>(result->state)) {
    case pumpkin_messages::IOState::BadFile:
        Q_EMIT(sendStatusMessage(QString("Error: BAD. The playback file may be corrupted."), 1000));
        ROS_ERROR("BAD FILE");
        break;
    case pumpkin_messages::IOState::EndOfFile:
        Q_EMIT(sendStatusMessage(QString("Error: EOF. Reached end of file before finishing the playback. There may have been a problem during recording."), 1000));
        ROS_ERROR("END OF FILE");
        break;
    case pumpkin_messages::IOState::ErrorOpening:
        Q_EMIT(sendStatusMessage(QString("Error: OPEN. There was an error opening the file."), 1000));
        ROS_ERROR("ERROR OPENING FILE");
        break;
    case pumpkin_messages::IOState::OK:
        Q_EMIT(sendStatusMessage(QString("Playback terminated successfully."), 1000));
        ROS_INFO("Playback executed successfully.");
        break;
    default:
        Q_EMIT(sendStatusMessage(QString("Fatal: UNKNOWN ERROR!"), 1000));
        ROS_FATAL("UNKNOWN ERROR");
    }

    if (_scene)
        Q_EMIT(scenePercentage(0, 0, 0));
    else
        Q_EMIT(playbackPercentage(0));
    Q_EMIT(playbackFinished(result->state));
}
void MediaView::setMediaObject(Phonon::MediaObject *mediaObject)
{
    this->mediaObject = mediaObject;
    Phonon::createPath(mediaObject, videoWidget);
    connect(mediaObject, SIGNAL(finished()), SLOT(playbackFinished()));
    connect(mediaObject, SIGNAL(stateChanged(Phonon::State, Phonon::State)),
            SLOT(stateChanged(Phonon::State, Phonon::State)));
    connect(mediaObject, SIGNAL(aboutToFinish()), SLOT(aboutToFinish()));
}
AudioPlayer::AudioPlayer()
{
#ifdef WIN32
    backend = new PortAudioPlayerBackend();
#endif
#ifdef UNIX
    backend = new PulseAudioPlayerBackend();
#endif
    connect(backend, SIGNAL(playbackFinished()), this, SIGNAL(finished()), Qt::QueuedConnection);
}
QT_BEGIN_NAMESPACE

QGstreamerPlayerControl::QGstreamerPlayerControl(QGstreamerPlayerSession *session, QObject *parent)
    : QMediaPlayerControl(parent)
    , m_ownStream(false)
    , m_session(session)
    , m_userRequestedState(QMediaPlayer::StoppedState)
    , m_currentState(QMediaPlayer::StoppedState)
    , m_mediaStatus(QMediaPlayer::NoMedia)
    , m_bufferProgress(-1)
    , m_pendingSeekPosition(-1)
    , m_setMediaPending(false)
    , m_stream(0)
{
    m_resources = QMediaResourcePolicy::createResourceSet<QMediaPlayerResourceSetInterface>();
    Q_ASSERT(m_resources);

    connect(m_session, SIGNAL(positionChanged(qint64)),
            this, SIGNAL(positionChanged(qint64)));
    connect(m_session, SIGNAL(durationChanged(qint64)),
            this, SIGNAL(durationChanged(qint64)));
    connect(m_session, SIGNAL(mutedStateChanged(bool)),
            this, SIGNAL(mutedChanged(bool)));
    connect(m_session, SIGNAL(volumeChanged(int)),
            this, SIGNAL(volumeChanged(int)));
    connect(m_session, SIGNAL(stateChanged(QMediaPlayer::State)),
            this, SLOT(updateSessionState(QMediaPlayer::State)));
    connect(m_session, SIGNAL(bufferingProgressChanged(int)),
            this, SLOT(setBufferProgress(int)));
    connect(m_session, SIGNAL(playbackFinished()),
            this, SLOT(processEOS()));
    connect(m_session, SIGNAL(audioAvailableChanged(bool)),
            this, SIGNAL(audioAvailableChanged(bool)));
    connect(m_session, SIGNAL(videoAvailableChanged(bool)),
            this, SIGNAL(videoAvailableChanged(bool)));
    connect(m_session, SIGNAL(seekableChanged(bool)),
            this, SIGNAL(seekableChanged(bool)));
    connect(m_session, SIGNAL(error(int,QString)),
            this, SIGNAL(error(int,QString)));
    connect(m_session, SIGNAL(invalidMedia()),
            this, SLOT(handleInvalidMedia()));
    connect(m_session, SIGNAL(playbackRateChanged(qreal)),
            this, SIGNAL(playbackRateChanged(qreal)));

    connect(m_resources, SIGNAL(resourcesGranted()), SLOT(handleResourcesGranted()));
    // The denied signal must be queued to get the state-update order right:
    // in playOrPause(), calling acquire() on the resource set may emit
    // resourcesDenied() immediately, so handleResourcesDenied() has to run
    // later; otherwise its effect would be overwritten by the state update
    // that follows in playOrPause().
    connect(m_resources, SIGNAL(resourcesDenied()), this, SLOT(handleResourcesDenied()), Qt::QueuedConnection);
    connect(m_resources, SIGNAL(resourcesLost()), SLOT(handleResourcesLost()));
}
SourceInfoWidget::SourceInfoWidget( const Tomahawk::source_ptr& source, QWidget* parent )
    : QWidget( parent )
    , ui( new Ui::SourceInfoWidget )
    , m_source( source )
{
    ui->setupUi( this );

    ui->historyView->setFrameShape( QFrame::NoFrame );
    ui->historyView->setAttribute( Qt::WA_MacShowFocusRect, 0 );
    ui->recentAlbumView->setFrameShape( QFrame::NoFrame );
    ui->recentAlbumView->setAttribute( Qt::WA_MacShowFocusRect, 0 );
    ui->recentCollectionView->setFrameShape( QFrame::NoFrame );
    ui->recentCollectionView->setAttribute( Qt::WA_MacShowFocusRect, 0 );

    TomahawkUtils::unmarginLayout( layout() );

    ui->historyView->overlay()->setEnabled( false );

    m_recentCollectionModel = new CollectionFlatModel( ui->recentCollectionView );
    m_recentCollectionModel->setStyle( TrackModel::Short );
    ui->recentCollectionView->setTrackModel( m_recentCollectionModel );
    ui->recentCollectionView->sortByColumn( TrackModel::Age, Qt::DescendingOrder );

    m_historyModel = new PlaylistModel( ui->historyView );
    m_historyModel->setStyle( TrackModel::Short );
    ui->historyView->setPlaylistModel( m_historyModel );
    m_historyModel->loadHistory( source, 25 );

    m_recentAlbumModel = new AlbumModel( ui->recentAlbumView );
    ui->recentAlbumView->setAlbumModel( m_recentAlbumModel );
    ui->recentAlbumView->proxyModel()->sort( -1 );

    onCollectionChanged();
    connect( source->collection().data(), SIGNAL( changed() ),
                                            SLOT( onCollectionChanged() ) );
    connect( source.data(), SIGNAL( playbackFinished( Tomahawk::query_ptr ) ),
                              SLOT( onPlaybackFinished( Tomahawk::query_ptr ) ) );

    m_title = tr( "New Additions" );
    if ( source->isLocal() )
    {
        m_description = tr( "My recent activity" );
    }
    else
    {
        m_description = tr( "Recent activity from %1" ).arg( source->friendlyName() );
    }

    m_pixmap.load( RESPATH "images/new-additions.png" );
}
QT_BEGIN_NAMESPACE

QGstreamerPlayerControl::QGstreamerPlayerControl(QGstreamerPlayerSession *session, QObject *parent)
    : QMediaPlayerControl(parent)
    , m_session(session)
    , m_state(QMediaPlayer::StoppedState)
    , m_mediaStatus(QMediaPlayer::NoMedia)
    , m_bufferProgress(-1)
    , m_stream(0)
    , m_fifoNotifier(0)
    , m_fifoCanWrite(false)
    , m_bufferSize(0)
    , m_bufferOffset(0)
{
    m_fifoFd[0] = -1;
    m_fifoFd[1] = -1;

    connect(m_session, SIGNAL(positionChanged(qint64)),
            this, SIGNAL(positionChanged(qint64)));
    connect(m_session, SIGNAL(durationChanged(qint64)),
            this, SIGNAL(durationChanged(qint64)));
    connect(m_session, SIGNAL(mutedStateChanged(bool)),
            this, SIGNAL(mutedChanged(bool)));
    connect(m_session, SIGNAL(volumeChanged(int)),
            this, SIGNAL(volumeChanged(int)));
    connect(m_session, SIGNAL(stateChanged(QMediaPlayer::State)),
            this, SLOT(updateState(QMediaPlayer::State)));
    connect(m_session, SIGNAL(bufferingProgressChanged(int)),
            this, SLOT(setBufferProgress(int)));
    connect(m_session, SIGNAL(playbackFinished()),
            this, SLOT(processEOS()));
    connect(m_session, SIGNAL(audioAvailableChanged(bool)),
            this, SIGNAL(audioAvailableChanged(bool)));
    connect(m_session, SIGNAL(videoAvailableChanged(bool)),
            this, SIGNAL(videoAvailableChanged(bool)));
    connect(m_session, SIGNAL(seekableChanged(bool)),
            this, SIGNAL(seekableChanged(bool)));
    connect(m_session, SIGNAL(error(int,QString)),
            this, SIGNAL(error(int,QString)));
}
SourceInfoWidget::SourceInfoWidget( const Tomahawk::source_ptr& source, QWidget* parent )
    : QWidget( parent )
    , ui( new Ui::SourceInfoWidget )
{
    ui->setupUi( this );

    ui->historyView->setFrameShape( QFrame::NoFrame );
    ui->historyView->setAttribute( Qt::WA_MacShowFocusRect, 0 );
    ui->recentAlbumView->setFrameShape( QFrame::NoFrame );
    ui->recentAlbumView->setAttribute( Qt::WA_MacShowFocusRect, 0 );
    ui->recentCollectionView->setFrameShape( QFrame::NoFrame );
    ui->recentCollectionView->setAttribute( Qt::WA_MacShowFocusRect, 0 );

    TomahawkUtils::unmarginLayout( layout() );

    ui->historyView->overlay()->setEnabled( false );

    m_recentCollectionModel = new CollectionFlatModel( ui->recentCollectionView );
    m_recentCollectionModel->setStyle( TrackModel::Short );
    ui->recentCollectionView->setTrackModel( m_recentCollectionModel );
    m_recentCollectionModel->addFilteredCollection( source->collection(), 250, DatabaseCommand_AllTracks::ModificationTime );

    m_historyModel = new PlaylistModel( ui->historyView );
    m_historyModel->setStyle( TrackModel::Short );
    ui->historyView->setPlaylistModel( m_historyModel );
    m_historyModel->loadHistory( source, 25 );

    connect( source.data(), SIGNAL( playbackFinished( Tomahawk::query_ptr ) ),
                              SLOT( onPlaybackFinished( Tomahawk::query_ptr ) ) );

    m_recentAlbumModel = new AlbumModel( ui->recentAlbumView );
    ui->recentAlbumView->setAlbumModel( m_recentAlbumModel );
    m_recentAlbumModel->addFilteredCollection( source->collection(), 20, DatabaseCommand_AllAlbums::ModificationTime );

    m_title = tr( "New Additions" );
    m_description = tr( "Recent activity from %1" ).arg( source->isLocal() ? tr( "Your Collection" ) : source->friendlyName() );
    m_pixmap.load( RESPATH "images/new-additions.png" );
}
void QGstreamerPlayerSession::busMessage(const QGstreamerMessage &message)
{
    GstMessage* gm = message.rawMessage();

    if (gm == 0) {
        // Null message, query current position
        quint32 newPos = position();

        if (newPos/1000 != m_lastPosition) {
            m_lastPosition = newPos/1000;
            emit positionChanged(newPos);
        }

        double volume = 1.0;
        g_object_get(G_OBJECT(m_playbin), "volume", &volume, NULL);
        if (m_volume != int(volume*100)) {
            m_volume = int(volume*100);
            emit volumeChanged(m_volume);
        }
    } else {
        //tag message comes from elements inside playbin, not from playbin itself
        if (GST_MESSAGE_TYPE(gm) == GST_MESSAGE_TAG) {
            //qDebug() << "tag message";
            GstTagList *tag_list;
            gst_message_parse_tag(gm, &tag_list);
            gst_tag_list_foreach(tag_list, addTagToMap, &m_tags);

            //qDebug() << m_tags;

            emit tagsChanged();
        }

        if (GST_MESSAGE_SRC(gm) == GST_OBJECT_CAST(m_playbin)) {
            switch (GST_MESSAGE_TYPE(gm)) {
            case GST_MESSAGE_STATE_CHANGED:
                {
                    GstState oldState;
                    GstState newState;
                    GstState pending;

                    gst_message_parse_state_changed(gm, &oldState, &newState, &pending);

#ifdef DEBUG_PLAYBIN
                    QStringList states;
                    states << "GST_STATE_VOID_PENDING" << "GST_STATE_NULL" << "GST_STATE_READY" << "GST_STATE_PAUSED" << "GST_STATE_PLAYING";

                    qDebug() << QString("state changed: old: %1 new: %2 pending: %3") \
                            .arg(states[oldState]) \
                            .arg(states[newState]) \
                            .arg(states[pending]);
#endif

                    switch (newState) {
                    case GST_STATE_VOID_PENDING:
                    case GST_STATE_NULL:
                        setSeekable(false);
                        finishVideoOutputChange();
                        if (m_state != QMediaPlayer::StoppedState)
                            emit stateChanged(m_state = QMediaPlayer::StoppedState);
                        break;
                    case GST_STATE_READY:
                        setSeekable(false);
                        if (m_state != QMediaPlayer::StoppedState)
                            emit stateChanged(m_state = QMediaPlayer::StoppedState);
                        break;
                    case GST_STATE_PAUSED:
                        {
                            QMediaPlayer::State prevState = m_state;
                            m_state = QMediaPlayer::PausedState;

                            //check for seekable
                            if (oldState == GST_STATE_READY) {
                                getStreamsInfo();
                                updateVideoResolutionTag();

                                /*
                                //gst_element_seek_simple doesn't work reliably here, have to find a better solution
                                GstFormat format = GST_FORMAT_TIME;
                                gint64 position = 0;
                                bool seekable = false;
                                if (gst_element_query_position(m_playbin, &format, &position)) {
                                    seekable = gst_element_seek_simple(m_playbin, format, GST_SEEK_FLAG_NONE, position);
                                }
                                setSeekable(seekable);
                                */

                                setSeekable(true);

                                if (!qFuzzyCompare(m_playbackRate, qreal(1.0))) {
                                    qreal rate = m_playbackRate;
                                    m_playbackRate = 1.0;
                                    setPlaybackRate(rate);
                                }
                            }

                            if (m_state != prevState)
                                emit stateChanged(m_state);

                            break;
                        }
                    case GST_STATE_PLAYING:
                        if (m_state != QMediaPlayer::PlayingState)
                            emit stateChanged(m_state = QMediaPlayer::PlayingState);
                        break;
                    }
                }
                break;

            case GST_MESSAGE_EOS:
                emit playbackFinished();
                break;

            case GST_MESSAGE_TAG:
            case GST_MESSAGE_STREAM_STATUS:
            case GST_MESSAGE_UNKNOWN:
                break;
            case GST_MESSAGE_ERROR:
                {
                    GError *err;
                    gchar *debug;
                    gst_message_parse_error(gm, &err, &debug);
                    emit error(int(QMediaPlayer::ResourceError), QString::fromUtf8(err->message));
                    qWarning() << "Error:" << QString::fromUtf8(err->message);
                    g_error_free(err);
                    g_free(debug);
                }
                break;
            case GST_MESSAGE_WARNING:
                {
                    GError *err;
                    gchar *debug;
                    gst_message_parse_warning(gm, &err, &debug);
                    qWarning() << "Warning:" << QString::fromUtf8(err->message);
                    g_error_free(err);
                    g_free(debug);
                }
                break;
            case GST_MESSAGE_INFO:
#ifdef DEBUG_PLAYBIN
                {
                    GError *err;
                    gchar *debug;
                    gst_message_parse_info(gm, &err, &debug);
                    qDebug() << "Info:" << QString::fromUtf8(err->message);
                    g_error_free(err);
                    g_free(debug);
                }
#endif
                break;
            case GST_MESSAGE_BUFFERING:
                {
                    int progress = 0;
                    gst_message_parse_buffering(gm, &progress);
                    emit bufferingProgressChanged(progress);
                }
                break;
            case GST_MESSAGE_STATE_DIRTY:
            case GST_MESSAGE_STEP_DONE:
            case GST_MESSAGE_CLOCK_PROVIDE:
            case GST_MESSAGE_CLOCK_LOST:
            case GST_MESSAGE_NEW_CLOCK:
            case GST_MESSAGE_STRUCTURE_CHANGE:
            case GST_MESSAGE_APPLICATION:
            case GST_MESSAGE_ELEMENT:
                break;
            case GST_MESSAGE_SEGMENT_START:
                {
                    const GstStructure *structure = gst_message_get_structure(gm);
                    qint64 position = g_value_get_int64(gst_structure_get_value(structure, "position"));
                    position /= 1000000;
                    m_lastPosition = position;
                    emit positionChanged(position);
                }
                break;
            case GST_MESSAGE_SEGMENT_DONE:
                break;
            case GST_MESSAGE_DURATION:
                {
                    GstFormat format = GST_FORMAT_TIME;
                    gint64 duration = 0;

                    if (gst_element_query_duration(m_playbin, &format, &duration)) {
                        int newDuration = duration / 1000000;
                        if (m_duration != newDuration) {
                            m_duration = newDuration;
                            emit durationChanged(m_duration);
                        }
                    }
                }
                break;
            case GST_MESSAGE_LATENCY:
#if (GST_VERSION_MAJOR >= 0) && (GST_VERSION_MINOR >= 10) && (GST_VERSION_MICRO >= 13)
            case GST_MESSAGE_ASYNC_START:
            case GST_MESSAGE_ASYNC_DONE:
#if GST_VERSION_MICRO >= 23
            case GST_MESSAGE_REQUEST_STATE:
#endif
#endif
            case GST_MESSAGE_ANY:
                break;
            default:
                break;
            }
        } else if (m_videoSink
                   && m_renderer
                   && GST_MESSAGE_SRC(gm) == GST_OBJECT_CAST(m_videoSink)
                   && GST_MESSAGE_TYPE(gm) == GST_MESSAGE_STATE_CHANGED) {
            GstState oldState;
            GstState newState;
            gst_message_parse_state_changed(gm, &oldState, &newState, 0);

            if (oldState == GST_STATE_READY && newState == GST_STATE_PAUSED)
                m_renderer->precessNewStream();
        }
    }
}
void MPlayerMediaWidget::readStandardOutput()
{
    QByteArray data = process.readAllStandardOutput();
    standardError.write(data); // forward
    standardError.flush();

    if ((data == "\n") || (data.indexOf("\n\n") >= 0)) {
        process.write("pausing_keep_force get_property path\n");
    }

    bool videoPropertiesChanged = false;
    QStringList audioStreams = getAudioStreams();
    bool audioStreamsChanged = false;
    QStringList subtitles = getSubtitles();
    bool subtitlesChanged = false;

    foreach (const QByteArray &line, data.split('\n')) {
        if (line.startsWith("VO: ")) {
            videoPropertiesChanged = true;
            continue;
        }

        if (line.startsWith("audio stream: ")) {
            int begin = 14;
            int end = line.indexOf(' ', begin);

            if (end < 0) {
                end = line.size();
            }

            int audioStreamIndex = line.mid(begin, end - begin).toInt();

            while (audioStreams.size() < audioStreamIndex) {
                audioStreams.append(QString::number(audioStreams.size() + 1));
            }

            while (audioIds.size() < audioStreamIndex) {
                audioIds.append(-1);
            }

            audioStreams.erase(audioStreams.begin() + audioStreamIndex, audioStreams.end());
            audioIds.erase(audioIds.begin() + audioStreamIndex, audioIds.end());

            QString audioStream;
            begin = line.indexOf("language: ");

            if (begin >= 0) {
                begin += 10;
                end = line.indexOf(' ', begin);

                if (end < 0) {
                    end = line.size();
                }

                audioStream = line.mid(begin, end - begin);
            }

            if (audioStream.isEmpty()) {
                audioStream = QString::number(audioStreams.size() + 1);
            }

            int audioId = -1;
            begin = line.indexOf("aid: ");

            if (begin >= 0) {
                begin += 5;
                end = line.indexOf('.', begin);

                if (end < 0) {
                    end = line.size();
                }

                audioId = line.mid(begin, end - begin).toInt();
            }

            audioStreams.append(audioStream);
            audioIds.append(audioId);
            audioStreamsChanged = true;
            continue;
        }

        if (line.startsWith("subtitle ")) {
            int begin = line.indexOf("( sid ): ");

            if (begin < 0) {
                continue;
            }

            begin += 9;
            int end = line.indexOf(' ', begin);

            if (end < 0) {
                end = line.size();
            }

            int subtitleIndex = line.mid(begin, end - begin).toInt();

            while (subtitles.size() < subtitleIndex) {
                subtitles.append(QString::number(subtitles.size() + 1));
            }

            subtitles.erase(subtitles.begin() + subtitleIndex, subtitles.end());

            QString subtitle;
            begin = line.indexOf("language: ");

            if (begin >= 0) {
                begin += 10;
                end = line.indexOf(' ', begin);

                if (end < 0) {
                    end = line.size();
                }

                subtitle = line.mid(begin, end - begin);
            }

            if (subtitle.isEmpty()) {
                subtitle = QString::number(subtitles.size() + 1);
            }

            subtitles.append(subtitle);
            subtitlesChanged = true;
            continue;
        }

        if (line == "ANS_path=(null)") {
            switch (getPlaybackStatus()) {
            case MediaWidget::Idle:
                break;
            case MediaWidget::Playing:
            case MediaWidget::Paused:
                playbackFinished();
                break;
            }

            resetState();
            continue;
        }

        if (line.startsWith("ANS_length=")) {
            int totalTime = (line.mid(11).toFloat() * 1000 + 0.5);
            updateCurrentTotalTime(getCurrentTime(), totalTime);
            continue;
        }

        if (line.startsWith("ANS_time_pos=")) {
            int currentTime = (line.mid(13).toFloat() * 1000 + 0.5);
            updateCurrentTotalTime(currentTime, getTotalTime());
            continue;
        }

        if (line.startsWith("ANS_width=")) {
            videoWidth = line.mid(10).toInt();

            if (videoWidth < 0) {
                videoWidth = 0;
            }

            continue;
        }

        if (line.startsWith("ANS_height=")) {
            videoHeight = line.mid(11).toInt();

            if (videoHeight < 0) {
                videoHeight = 0;
            }

            continue;
        }

        if (line.startsWith("ANS_aspect=")) {
            videoAspectRatio = line.mid(11).toFloat();

            if ((videoAspectRatio > 0.01) && (videoAspectRatio < 100)) {
                // ok
            } else {
                videoAspectRatio = (videoWidth / float(videoHeight));

                if ((videoAspectRatio > 0.01) && (videoAspectRatio < 100)) {
                    // ok
                } else {
                    videoAspectRatio = 1;
                }
            }

            updateVideoWidgetGeometry();
            continue;
        }

        if (line.startsWith("ANS_switch_audio=")) {
            int audioId = line.mid(17).toInt();
            updateCurrentAudioStream(audioIds.indexOf(audioId));
            continue;
        }

        if (line.startsWith("ANS_sub=")) {
            int currentSubtitle = line.mid(8).toInt();
            updateCurrentSubtitle(currentSubtitle);
            continue;
        }
    }

    if (videoPropertiesChanged) {
        process.write("pausing_keep_force get_property width\n"
                      "pausing_keep_force get_property height\n"
                      "pausing_keep_force get_property aspect\n");
    }

    if (audioStreamsChanged) {
        updateAudioStreams(audioStreams);
        process.write("pausing_keep_force get_property switch_audio\n");
    }

    if (subtitlesChanged) {
        updateSubtitles(subtitles);
        process.write("pausing_keep_force get_property sub\n");
    }
}
bool AudioOutputSample::needSamples(unsigned int snum)
{
    // Forward the buffer
    for (unsigned int i = iLastConsume; i < iBufferFilled; ++i)
        pfBuffer[i-iLastConsume] = pfBuffer[i];
    iBufferFilled -= iLastConsume;
    iLastConsume = snum;

    // Check if we can satisfy the request with the current buffer
    if (iBufferFilled >= snum)
        return true;

    // Calculate the required buffer size to hold the results
    unsigned int iInputFrames = static_cast<unsigned int>(ceilf(static_cast<float>(snum * sfHandle->samplerate()) / static_cast<float>(iOutSampleRate)));
    unsigned int iInputSamples = iInputFrames * sfHandle->channels();

    float *pOut;
    bool mix = sfHandle->channels() > 1;
    STACKVAR(float, fOut, iInputSamples);
    STACKVAR(float, fMix, iInputFrames);

    bool eof = false;
    sf_count_t read;
    do {
        resizeBuffer(iBufferFilled + snum);

        // If we need to resample or mix, write to the buffer on the stack
        pOut = (srs || mix) ? fOut : pfBuffer + iBufferFilled;

        // Try to read all samples needed to satisfy this request
        if ((read = sfHandle->read(pOut, iInputSamples)) < iInputSamples) {
            if (sfHandle->error() != SF_ERR_NO_ERROR || !bLoop) {
                // We reached the EOF or encountered an error, stuff with zeroes
                memset(pOut, 0, sizeof(float) * (iInputSamples - read));
                read = iInputSamples;
                eof = true;
            } else {
                sfHandle->seek(SEEK_SET, 0);
            }
        }

        if (mix) {
            // Mix the channels (only two channels)
            read /= 2;
            // If we need to resample after this, write to the extra buffer
            pOut = srs ? fMix : pfBuffer + iBufferFilled;
            for (unsigned int i = 0; i < read; i++)
                pOut[i] = (fOut[i*2] + fOut[i*2 + 1]) * 0.5f;
        }

        spx_uint32_t inlen = static_cast<unsigned int>(read);
        spx_uint32_t outlen = snum;
        if (srs) // If necessary, resample
            speex_resampler_process_float(srs, 0, pOut, &inlen, pfBuffer + iBufferFilled, &outlen);
        iBufferFilled += outlen;
    } while (iBufferFilled < snum);

    if (eof && !bEof) {
        emit playbackFinished();
        bEof = true;
    }

    return !eof;
}
void WelcomeWidget::onSourceAdded( const Tomahawk::source_ptr& source )
{
    connect( source.data(), SIGNAL( playbackFinished( Tomahawk::query_ptr ) ),
                              SLOT( onPlaybackFinished( Tomahawk::query_ptr ) ) );
}
AudioPlayer::~AudioPlayer()
{
    disconnect(backend, SIGNAL(playbackFinished()), this, SIGNAL(finished()));
    delete backend;
}
void QAndroidPlayerSession::playFinshed()
{
    emit playbackFinished();
    m_helper->doTheWork(2, 0);
}