OnlineItemInfo ArchiveOrg::displayItemDetails(QListWidgetItem *item)
{
    OnlineItemInfo info;
    m_metaInfo.clear();
    if (!item) {
        return info;
    }
    info.itemPreview = item->data(previewRole).toString();
    info.itemDownload = item->data(downloadRole).toString();
    info.itemId = item->data(idRole).toInt();
    info.itemName = item->text();
    info.infoUrl = item->data(infoUrl).toString();
    info.author = item->data(authorRole).toString();
    info.authorUrl = item->data(authorUrl).toString();
    info.license = item->data(licenseRole).toString();
    info.description = item->data(descriptionRole).toString();

    m_metaInfo.insert(QStringLiteral("url"), info.itemDownload);
    m_metaInfo.insert(QStringLiteral("id"), info.itemId);

    QString extraInfoUrl = item->data(downloadRole).toString();
    if (!extraInfoUrl.isEmpty()) {
        KJob *resolveJob = KIO::storedGet(QUrl(extraInfoUrl), KIO::NoReload, KIO::HideProgressInfo);
        resolveJob->setProperty("id", info.itemId);
        connect(resolveJob, &KJob::result, this, &ArchiveOrg::slotParseResults);
    }
    return info;
}
void FileObjectEditDialog::saveAndMergeUrlChange()
{
    QString newUrl = ui->editUrl->fullText();
    QString existingUrl = m_fileObject.property(NIE::url()).toString();

    if (newUrl == existingUrl) {
        return;
    }

    if (!newUrl.isEmpty()) {
        QString query = "Select DISTINCT ?r where {"
                        "?r nie:url ?url . FILTER ( regex(?url, \"^" + newUrl + "$\"))"
                        "}";
        QList<Nepomuk2::Query::Result> queryResult = Nepomuk2::Query::QueryServiceClient::syncSparqlQuery(query);

        if (!queryResult.isEmpty() && queryResult.first().resource().uri() != m_fileObject.uri()) {
            kDebug() << "found a duplicate with url" << newUrl << "merge it";
            KJob *job = Nepomuk2::mergeResources(queryResult.first().resource().uri(), m_fileObject.uri());
            job->exec();
            if (job->error() != 0) {
                kDebug() << job->errorString() << job->errorText();
            }
            setResource(queryResult.first().resource());
        } else {
            kDebug() << "set url to" << newUrl;
            QList<QUrl> fileObjectUri;
            fileObjectUri << m_fileObject.uri();
            QVariantList fileObjectValue;
            fileObjectValue << newUrl;
            Nepomuk2::setProperty(fileObjectUri, NIE::url(), fileObjectValue);
        }
    }
}
void ArchiveOrg::slotParseResults(KJob *job)
{
    KIO::StoredTransferJob *storedQueryJob = static_cast<KIO::StoredTransferJob *>(job);
    QDomDocument doc;
    doc.setContent(QString::fromUtf8(storedQueryJob->data()));
    QDomNodeList links = doc.elementsByTagName(QStringLiteral("a"));

    QString html = QStringLiteral("<style type=\"text/css\">tr.cellone {background-color: %1;}").arg(qApp->palette().alternateBase().color().name());
    html += QLatin1String("</style><table width=\"100%\" cellspacing=\"0\" cellpadding=\"2\">");

    QString link;
    int ct = 0;
    m_thumbsPath.clear();
    for (int i = 0; i < links.count(); ++i) {
        QString href = links.at(i).toElement().attribute(QStringLiteral("href"));
        if (href.endsWith(QLatin1String(".thumbs/"))) {
            // sub folder contains image thumbs, display one.
            m_thumbsPath = m_metaInfo.value(QStringLiteral("url")) + '/' + href;
            KJob *thumbJob = KIO::storedGet(QUrl(m_thumbsPath), KIO::NoReload, KIO::HideProgressInfo);
            thumbJob->setProperty("id", m_metaInfo.value(QStringLiteral("id")));
            connect(thumbJob, &KJob::result, this, &ArchiveOrg::slotParseThumbs);
        } else if (!href.contains('/') && !href.endsWith(QLatin1String(".xml"))) {
            link = m_metaInfo.value(QStringLiteral("url")) + '/' + href;
            ct++;
            if (ct % 2 == 0) {
                html += QLatin1String("<tr class=\"cellone\">");
            } else {
                html += QLatin1String("<tr>");
            }
            html += "<td>" + QUrl(link).fileName() + QStringLiteral("</td><td><a href=\"%1\">%2</a></td><td><a href=\"%3\">%4</a></td></tr>").arg(link).arg(i18n("Preview")).arg(link + "_import").arg(i18n("Import"));
        }
    }
    html += QLatin1String("</table>");

    if (m_metaInfo.value(QStringLiteral("id")) == job->property("id").toString()) {
        emit gotMetaInfo(html);
    }
}
void IpodCopyTracksJob::slotStartOrTranscodeCopyJob(const KUrl &sourceUrl, const KUrl &destUrl)
{
    KJob *job = 0;
    if (m_transcodingConfig.isJustCopy()) {
        if (m_goingToRemoveSources && m_coll
            && sourceUrl.toLocalFile().startsWith(m_coll.data()->mountPoint())) {
            // special case for "add orphaned tracks" to both save space and significantly
            // speed up the process:
            debug() << "Moving from" << sourceUrl << "to" << destUrl;
            job = KIO::file_move(sourceUrl, destUrl, -1, KIO::HideProgressInfo | KIO::Overwrite);
        } else {
            debug() << "Copying from" << sourceUrl << "to" << destUrl;
            job = KIO::file_copy(sourceUrl, destUrl, -1, KIO::HideProgressInfo | KIO::Overwrite);
        }
    } else {
        debug() << "Transcoding from" << sourceUrl << "to" << destUrl;
        job = new Transcoding::Job(sourceUrl, destUrl, m_transcodingConfig);
    }
    job->setUiDelegate(0); // be non-interactive
    job->setAutoDelete(true);
    // we must use finished() instead of result() to prevent a deadlock
    connect(job, SIGNAL(finished(KJob*)), SLOT(slotCopyOrTranscodeJobFinished()));
    job->start(); // no-op for KIO jobs, but matters for the transcoding job
}
void GitRunner::init()
{
    QStringList command;
    command << "init";
    KJob *job = initJob(command);
    connect(job, SIGNAL(result(KJob*)), this, SLOT(handleInit(KJob*)));
    job->start();
}
void Task::start()
{
    kDebug(14010) << "Executing child tasks for this task.";
    foreach (KJob *subTask, subjobs()) {
        subTask->start();
    }
}
void GitRunner::deleteCommit(const QString &sha1hash)
{
    QStringList command;
    command << "reset" << "--hard" << sha1hash;
    KJob *job = initJob(command);
    connect(job, SIGNAL(result(KJob*)), this, SLOT(handleDeleteCommit(KJob*)));
    job->start();
}
void KUiServerJobTracker::Private::_k_killJob()
{
    org::kde::JobViewV2 *jobView = qobject_cast<org::kde::JobViewV2 *>(q->sender());
    if (jobView) {
        KJob *job = progressJobView.key(jobView);
        if (job) {
            job->kill(KJob::EmitResult);
        }
    }
}
void PlanExecutor::startRepairJob()
{
    if (mPlan->mBackupType != BackupPlan::BupType || busy() || !destinationAvailable()) {
        return;
    }
    KJob *lJob = new BupRepairJob(*mPlan, mDestinationPath, mLogFilePath, mKupDaemon);
    connect(lJob, SIGNAL(result(KJob*)), SLOT(repairFinished(KJob*)));
    lJob->start();
    mLastState = mState;
    mState = REPAIRING;
    emit stateChanged();
    startSleepInhibit();
}
void PlanExecutor::startIntegrityCheck()
{
    if (mPlan->mBackupType != BackupPlan::BupType || busy() || !destinationAvailable()) {
        return;
    }
    KJob *lJob = new BupVerificationJob(*mPlan, mDestinationPath, mLogFilePath, mKupDaemon);
    connect(lJob, SIGNAL(result(KJob*)), SLOT(integrityCheckFinished(KJob*)));
    lJob->start();
    mLastState = mState;
    mState = INTEGRITY_TESTING;
    emit stateChanged();
    startSleepInhibit();
}
void XmlValidatorJob::start()
{
    if (m_documentUrl.isEmpty()) {
        m_result = Failed;
        m_errors.append(i18n("No document to validate"));
        setError(m_result);
        emitResult();
        return;
    }

    QString localUrl;
    KJob *copyJob = 0;

    // DTD inline
    if (m_dtdUrl.isEmpty() && m_schemaUrl.isEmpty()) {
        emit signalReady(this);
        return;
    }

    if (!m_dtdUrl.isEmpty()) {
        localUrl = getLocalURLForSchema(m_documentUrl, m_dtdUrl);
        if (QFile::exists(localUrl)) {
            m_dtdUrl = localUrl;
            emit signalReady(this);
            return;
        } else {
            copyJob = KIO::copy(m_dtdUrl, localUrl, KIO::HideProgressInfo);
            m_dtdUrl = localUrl;
        }
    }

    if (!m_schemaUrl.isEmpty()) {
        localUrl = getLocalURLForSchema(m_documentUrl, m_schemaUrl);
        if (QFile::exists(localUrl)) {
            m_schemaUrl = localUrl;
            emit signalReady(this);
            return;
        } else {
            // Note: if a DTD copy job was started above, it is overwritten here
            // and never connected; only this last copy job is tracked.
            copyJob = KIO::copy(m_schemaUrl, localUrl, KIO::HideProgressInfo);
            m_schemaUrl = localUrl;
        }
    }

    copyJob->setAutoDelete(true);
    copyJob->setUiDelegate(0);
    connect(copyJob, SIGNAL(result(KJob*)), this, SLOT(ready(KJob*)));
    copyJob->start();
}
ThumbnailLoadJob::~ThumbnailLoadJob()
{
    LOG(this);
    if (hasSubjobs()) {
        LOG("Killing subjob");
        KJob *job = subjobs().first();
        job->kill();
        removeSubjob(job);
    }
    mThumbnailThread.cancel();
    mThumbnailThread.wait();
    if (!sThumbnailCache->isRunning()) {
        sThumbnailCache->start();
    }
}
void GitRunner::createWorkingCopy(const KUrl &repoOrigin, const KUrl &repoDestination)
{
    // TODO: currently supports only cloning a local repo (not very useful, I know =P),
    // so extend the method to work over the Internet as well.
    m_lastRepoRoot->setDirectory(repoDestination.pathOrUrl());
    QStringList command;
    // "clone" and its argument must be separate list entries; the previous
    // "clone " + path concatenation passed them to git as a single argument.
    command << "clone" << repoOrigin.pathOrUrl();
    KJob *job = initJob(command);
    connect(job, SIGNAL(result(KJob*)), this, SLOT(handleCreateWorkingCopy(KJob*)));
    job->start();
}
void GitRunner::moveToCommit(const QString &sha1hash, const QString &newBranch)
{
    QStringList command;
    command << "branch" << newBranch << sha1hash;
    execSynchronously(command);

    command.clear();
    command << "checkout" << newBranch;
    KJob *job = initJob(command);
    connect(job, SIGNAL(result(KJob*)), this, SLOT(handleMoveToCommit(KJob*)));
    job->start();
}
bool Nepomuk2::Indexer::clearIndexingData(const QUrl &url)
{
    kDebug() << "Starting to clear";
    KJob *job = Nepomuk2::clearIndexedData(url);
    job->exec();
    kDebug() << "Done";

    if (job->error()) {
        m_lastError = job->errorString();
        kError() << m_lastError;
        return false;
    }
    return true;
}
QUrl SharePlugin::destinationDir() const
{
    const QString defaultDownloadPath = QStandardPaths::writableLocation(QStandardPaths::DownloadLocation);
    QUrl dir = QUrl::fromLocalFile(config()->get<QString>("incoming_path", defaultDownloadPath));

    if (dir.path().contains("%1")) {
        dir.setPath(dir.path().arg(device()->name()));
    }

    KJob *job = KIO::mkpath(dir);
    bool ret = job->exec();
    if (!ret) {
        qWarning() << "couldn't create" << dir;
    }

    return dir;
}
void ThumbnailLoadJob::removeItems(const KFileItemList &itemList)
{
    Q_FOREACH(const KFileItem &item, itemList) {
        // If we are removing the next item, update to be the item after or the
        // first if we removed the last item
        mItems.removeAll(item);

        if (item == mCurrentItem) {
            // Abort current item
            mCurrentItem = KFileItem();
            if (hasSubjobs()) {
                KJob *job = subjobs().first();
                job->kill();
                removeSubjob(job);
            }
        }
    }
}
void GitRunner::remove(const KUrl::List &files)
{
    if (files.empty()) {
        return;
    }
    QStringList command;
    command << "rm"; // no trailing space: each git argument must be its own list entry
    QStringList stringFiles = files.toStringList();
    while (!stringFiles.isEmpty()) {
        command.append(m_lastRepoRoot->pathOrUrl() + '/' + stringFiles.takeAt(0));
    }
    KJob *job = initJob(command);
    connect(job, SIGNAL(result(KJob*)), this, SLOT(handleRemove(KJob*)));
    job->start();
}
QString GitRunner::execSynchronously(const QStringList &command)
{
    KJob *job = initJob(command);
    if (!job->exec()) {
        handleError(job);
        return QString();
    }
    DvcsJob *j = qobject_cast<DvcsJob *>(job);
    if (!j) {
        return QString();
    }
    return j->output();
}
void GitRunner::commit(const QString &message)
{
    // NOTE: git does not allow an empty commit message!
    if (message.isEmpty()) {
        return;
    }
    QStringList command;
    command << "commit" << "-m";
    // Note: the message is quoted somewhere else
    command << message;
    KJob *job = initJob(command);
    connect(job, SIGNAL(result(KJob*)), this, SLOT(handleCommit(KJob*)));
    job->start();
}
void KwlImporterJob::run()
{
    if (!userHasWallets()) {
        qDebug() << "No wallets found in the user directory";
        return;
    }

    // Look for existing wallets, check that there is no collection with the
    // same label, and start the conversion job(s)
    QDir dir(KGlobal::dirs()->saveLocation("data", "kwallet", false), "*.kwl");
    dir.setFilter(QDir::Files | QDir::Hidden);

    uint amount = 0;
    foreach (const QFileInfo &fi, dir.entryInfoList()) {
        KJob *importJob = new ImportSingleWalletJob(fi.absoluteFilePath(), this);
        if (!addSubjob(importJob)) {
            qDebug() << "Cannot add import subjob";
        }
        importJob->start();
        amount++;
    }
    setTotalAmount(Files, amount);
}
void FileObjectEditDialog::typeChanged(int newType)
{
    bool switchToWebsite = true;
    bool switchToFile = true;

    QList<QUrl> currentTypes = m_fileObject.types();

    if (newType == 0) { // local file
        switchToWebsite = false;
        if (currentTypes.contains(NFO::FileDataObject())) {
            switchToFile = false;
        }
        if (currentTypes.contains(NFO::Website())) {
            switchToFile = true;
        }
        currentTypes.removeAll(NFO::Website());
        currentTypes.removeAll(NFO::WebDataObject());
        currentTypes.removeAll(NFO::RemoteDataObject());
        if (!currentTypes.contains(NFO::FileDataObject())) {
            currentTypes.append(NFO::FileDataObject());
        }
    } else if (newType == 1) { // remote file
        switchToWebsite = false;
        if (currentTypes.contains(NFO::FileDataObject())) {
            switchToFile = false;
        }
        if (currentTypes.contains(NFO::Website())) {
            switchToFile = true;
        }
        currentTypes.removeAll(NFO::Website());
        currentTypes.removeAll(NFO::WebDataObject());
        if (!currentTypes.contains(NFO::FileDataObject())) {
            currentTypes.append(NFO::FileDataObject());
        }
        if (!currentTypes.contains(NFO::RemoteDataObject())) {
            currentTypes.append(NFO::RemoteDataObject());
        }
    } else if (newType == 2) { // nfo:website
        switchToFile = false;
        if (currentTypes.contains(NFO::FileDataObject())) {
            switchToWebsite = true;
        }
        if (currentTypes.contains(NFO::Website())) {
            switchToWebsite = false;
        }
        currentTypes.removeAll(NFO::FileDataObject());
        currentTypes.removeAll(NFO::RemoteDataObject());
        if (!currentTypes.contains(NFO::WebDataObject())) {
            currentTypes.append(NFO::WebDataObject());
        }
        if (!currentTypes.contains(NFO::Website())) {
            currentTypes.append(NFO::Website());
        }
    }

    QList<QUrl> publicationUri;
    publicationUri << m_publication.uri();
    QVariantList publicationValue;
    publicationValue << m_publication.uri();
    QList<QUrl> fileObjectUri;
    fileObjectUri << m_fileObject.uri();
    QVariantList fileObjectValue;
    fileObjectValue << m_fileObject.uri();

    QVariantList typeValue;
    foreach (const QUrl &url, currentTypes) {
        typeValue << url;
    }

    //m_fileObject.setTypes(currentTypes); // this approach is not working
    QList<QUrl> removeAllTypes;
    removeAllTypes << RDF::type();
    KJob *job = Nepomuk2::removeProperties(fileObjectUri, removeAllTypes);
    if (!job->exec()) {
        kDebug() << job->errorString();
    }
    KJob *job2 = Nepomuk2::setProperty(fileObjectUri, RDF::type(), typeValue);
    if (!job2->exec()) {
        kDebug() << job2->errorString();
    }

    // change crosslink from nbib:publicationOf / nbib:isPublishedAs to nie:links
    if (switchToFile) {
        Nepomuk2::removeProperty(publicationUri, NIE::links(), fileObjectValue);
        Nepomuk2::addProperty(publicationUri, NBIB::isPublicationOf(), fileObjectValue);
        Nepomuk2::addProperty(fileObjectUri, NBIB::publishedAs(), publicationValue);
    } else if (switchToWebsite) {
        Nepomuk2::removeProperty(publicationUri, NBIB::isPublicationOf(), fileObjectValue);
        Nepomuk2::removeProperty(fileObjectUri, NBIB::publishedAs(), publicationValue);
        Nepomuk2::addProperty(publicationUri, NIE::links(), fileObjectValue);
    }
}
void SuspendSession::triggerImpl(const QVariantMap &args)
{
    qCDebug(POWERDEVIL) << "Suspend session triggered with" << args;

    const auto mode = static_cast<Mode>(args["Type"].toUInt());

    if (mode == ToRamMode || mode == ToDiskMode || mode == SuspendHybridMode) {
        // don't suspend if shutting down
        if (KWorkSpace::isShuttingDown()) {
            qCDebug(POWERDEVIL) << "Not suspending because a shutdown is in progress";
            return;
        }

        if (!args["SkipFade"].toBool()) {
            m_savedArgs = args;
            m_fadeEffect->start();
            return;
        }
    }

    if (args["GraceFade"].toBool()) {
        return;
    }

    // Switch for real action
    KJob *suspendJob = 0;
    switch (mode) {
    case ToRamMode:
        Q_EMIT aboutToSuspend();
        suspendJob = backend()->suspend(PowerDevil::BackendInterface::ToRam);
        break;
    case ToDiskMode:
        Q_EMIT aboutToSuspend();
        suspendJob = backend()->suspend(PowerDevil::BackendInterface::ToDisk);
        break;
    case SuspendHybridMode:
        Q_EMIT aboutToSuspend();
        suspendJob = backend()->suspend(PowerDevil::BackendInterface::HybridSuspend);
        break;
    case ShutdownMode:
        KWorkSpace::requestShutDown(KWorkSpace::ShutdownConfirmNo, KWorkSpace::ShutdownTypeHalt);
        break;
    case LogoutDialogMode:
        KWorkSpace::requestShutDown(KWorkSpace::ShutdownConfirmYes);
        break;
    case LockScreenMode:
        // TODO should probably go through the backend (logind perhaps) eventually
        QDBusConnection::sessionBus().asyncCall(QDBusMessage::createMethodCall("org.freedesktop.ScreenSaver",
                                                                              "/ScreenSaver",
                                                                              "org.freedesktop.ScreenSaver",
                                                                              "Lock"));
        break;
    default:
        break;
    }

    if (suspendJob) {
        // TODO connect(suspendJob, &KJob::error ??, this, [this]() { m_fadeEffect->stop(); });
        suspendJob->start();
    }
}