RequestQueue
Hierarchy
- RequestProvider
- RequestQueue
Index
Properties
assumedHandledCount
assumedTotalCount
client
clientKey
readonly config
id
internalTimeoutMillis
log
optional name
requestLockSecs
timeoutSecs
Methods
addRequest
Parameters
...args: [requestLike: Source, options?: RequestQueueOperationOptions]
Returns Promise<RequestQueueOperationInfo>
addRequests
Parameters
...args: [requestsLike: Source[], options?: RequestQueueOperationOptions]
Returns Promise<BatchAddRequestsResult>
addRequestsBatched
Parameters
...args: [requests: (string | Source)[], options?: AddRequestsBatchedOptions]
Returns Promise<AddRequestsBatchedResult>
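A rough sketch of how the three methods above differ, assuming an already opened queue; the URLs are placeholders:
const queue = await RequestQueue.open();

// Add a single request; a plain object with a url is accepted as Source.
await queue.addRequest({ url: 'https://example.com/start' });

// Add several requests in a single operation.
await queue.addRequests([
    { url: 'https://example.com/a' },
    { url: 'https://example.com/b' },
]);

// Add a larger set; plain URL strings are also accepted, and the
// requests are submitted to the queue in batches.
await queue.addRequestsBatched([
    'https://example.com/1',
    'https://example.com/2',
    'https://example.com/3',
]);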
drop
Removes the queue either from the Apify Cloud storage or from the local database, depending on the mode of operation.
Returns Promise<void>
fetchNextRequest
Returns the next request in the queue to be processed, or null if there are no more pending requests.
Once you successfully finish processing the request, you need to call RequestQueue.markRequestHandled to mark the request as handled in the queue. If there was some error in processing the request, call RequestQueue.reclaimRequest instead, so that the queue will give the request to some other consumer in another call to the fetchNextRequest function.
Note that the null return value doesn't mean the queue processing is finished; it means there are currently no pending requests. To check whether all requests in the queue were finished, use RequestQueue.isFinished instead.
Type parameters
- T: Dictionary = Dictionary
Returns Promise<null | Request<T>>
Returns the request object, or null if there are no more pending requests.
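A minimal sketch of the fetch/handle/reclaim cycle described above, assuming the queue is already open; processRequest is a hypothetical user-defined handler, not part of the API:
while (!(await queue.isFinished())) {
    const request = await queue.fetchNextRequest();
    if (!request) {
        // No pending requests right now, but some may still be in progress elsewhere.
        // (A real consumer would typically wait a moment before retrying.)
        continue;
    }
    try {
        await processRequest(request); // hypothetical handler
        await queue.markRequestHandled(request);
    } catch (err) {
        // Processing failed - give the request back so it can be retried later.
        await queue.reclaimRequest(request);
    }
}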
getInfo
Returns an object containing general information about the request queue.
The function returns the same object as the Apify API Client's getQueue function, which in turn calls the Get request queue API endpoint.
Example:
{
id: "WkzbQMuFYuamGv3YF",
name: "my-queue",
userId: "wRsJZtadYvn4mBZmm",
createdAt: new Date("2015-12-12T07:34:14.202Z"),
modifiedAt: new Date("2015-12-13T08:36:13.202Z"),
accessedAt: new Date("2015-12-14T08:36:13.202Z"),
totalRequestCount: 25,
handledRequestCount: 5,
pendingRequestCount: 20,
}
Returns Promise<undefined | RequestQueueInfo>
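A brief sketch of reading these counts, assuming an open queue; the guard is needed because getInfo may resolve to undefined:
const info = await queue.getInfo();
if (info) {
    console.log(`${info.pendingRequestCount} of ${info.totalRequestCount} requests still pending`);
}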
getRequest
Gets the request from the queue specified by ID.
Type parameters
- T: Dictionary = Dictionary
Parameters
id: string
ID of the request.
Returns Promise<null | Request<T>>
Returns the request object, or null if it was not found.
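A minimal sketch, assuming the request ID was recorded earlier, for example via the requestId field of the operation info returned by addRequest (field name assumed here):
const { requestId } = await queue.addRequest({ url: 'https://example.com' });
const request = await queue.getRequest(requestId);
if (request) {
    console.log(request.url); // https://example.com
}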
getTotalCount
Returns an offline approximation of the total number of requests in the queue (i.e. pending + handled).
Survives restarts and actor migrations.
Returns number
handledCount
Returns the number of handled requests.
This function is just a convenient shortcut for:
const { handledRequestCount } = await queue.getInfo();
Returns Promise<number>
isEmpty
Resolves to true if the next call to RequestQueue.fetchNextRequest would return null; otherwise it resolves to false.
Note that even if the queue is empty, there might be some pending requests currently being processed. If you need to ensure that there is no activity in the queue, use RequestQueue.isFinished.
Returns Promise<boolean>
isFinished
Returns Promise<boolean>
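A short sketch of the difference between the two checks above, assuming an open queue:
// True when there are no more pending requests to fetch right now,
// even though some fetched requests may still be in progress.
const empty = await queue.isEmpty();

// True only when every request has been both fetched and marked as handled.
const finished = await queue.isFinished();

if (empty && !finished) {
    console.log('Queue drained, but some requests are still being processed.');
}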
markRequestHandled
Parameters
...args: [request: Request<Dictionary>]
Returns Promise<null | RequestQueueOperationInfo>
reclaimRequest
Parameters
...args: [request: Request<Dictionary>, options?: RequestQueueOperationOptions]
Returns Promise<null | RequestQueueOperationInfo>
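A small sketch of reclaiming a failed request ahead of the rest of the queue, assuming request was obtained from fetchNextRequest and that forefront is the relevant RequestQueueOperationOptions flag:
// Retry this request before the remaining pending requests.
await queue.reclaimRequest(request, { forefront: true });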
static open
Parameters
...args: [queueIdOrName?: null | string, options?: StorageManagerOptions]
Returns Promise<RequestQueue>
Represents a queue of URLs to crawl, which is used for deep crawling of websites where you start with several URLs and then recursively follow links to other pages. The data structure supports both breadth-first and depth-first crawling orders.
Each URL is represented using an instance of the Request class. The queue can only contain unique URLs. More precisely, it can only contain Request instances with distinct uniqueKey properties. By default, uniqueKey is generated from the URL, but it can also be overridden. To add a single URL multiple times to the queue, the corresponding Request objects will need to have different uniqueKey properties.
Do not instantiate this class directly; use the RequestQueue.open function instead.
RequestQueue is used by BasicCrawler, CheerioCrawler, PuppeteerCrawler and PlaywrightCrawler as a source of URLs to crawl. Unlike RequestList, RequestQueue supports dynamic adding and removing of requests. On the other hand, the queue is not optimized for operations that add or remove a large number of URLs in a batch.
RequestQueue stores its data either on local disk or in the Apify Cloud, depending on whether the APIFY_LOCAL_STORAGE_DIR or APIFY_TOKEN environment variable is set.
If the APIFY_LOCAL_STORAGE_DIR environment variable is set, the queue data is stored in that directory in an SQLite database file.
If the APIFY_TOKEN environment variable is set but APIFY_LOCAL_STORAGE_DIR is not, the data is stored in the Apify Request Queue cloud storage. Note that you can also force usage of the cloud storage by passing the forceCloud option to the RequestQueue.open function, even if the APIFY_LOCAL_STORAGE_DIR variable is set.
Example usage:
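A minimal sketch of typical usage; the queue name and URLs below are placeholders:
// Open the default request queue associated with the current run.
const queue = await RequestQueue.open();

// Open a named request queue.
const namedQueue = await RequestQueue.open('my-queue');

// Enqueue a few requests.
await queue.addRequest({ url: 'http://example.com/aaa' });
await queue.addRequest({ url: 'http://example.com/bbb' });

// Put a request at the front of the queue (useful for depth-first crawling).
await queue.addRequest({ url: 'http://example.com/ccc' }, { forefront: true });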