
Commit

deploy: f8e6a61
j-mendez committed Nov 28, 2023
1 parent 3eaa3fd commit a1c2f88
Showing 4 changed files with 38 additions and 2 deletions.
18 changes: 18 additions & 0 deletions nodejs.html
@@ -179,6 +179,9 @@ <h2 id="installation"><a class="header" href="#installation">Installation</a></h2>
<li><code>npm i @spider-rs/spider-rs --save</code></li>
</ol>
<h2 id="usage"><a class="header" href="#usage">Usage</a></h2>
<p>The examples below can help you get started with spider.</p>
<h3 id="basic"><a class="header" href="#basic">Basic</a></h3>
<p>A basic example.</p>
<pre><code class="language-ts">import { Website } from &quot;@spider-rs/spider-rs&quot;;

const website = new Website(&quot;https://rsseau.fr&quot;)
@@ -194,6 +197,21 @@ <h2 id="usage"><a class="header" href="#usage">Usage</a></h2>

await website.crawl();
console.log(website.getLinks());
</code></pre>
<h3 id="events"><a class="header" href="#events">Events</a></h3>
<p>You can pass a function, which may be async, as a parameter to <code>crawl</code> and <code>scrape</code>.</p>
<pre><code class="language-ts">import { Website, type NPage } from &quot;@spider-rs/spider-rs&quot;;

const website = new Website(&quot;https://rsseau.fr&quot;);

const links: NPage[] = [];

const onPageEvent = (err: Error | null, value: NPage) =&gt; {
links.push(value);
};

await website.crawl(onPageEvent);
console.log(website.getLinks());
</code></pre>
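Not part of the diff above: a minimal sketch of the async variant that the Events paragraph describes, assuming crawl accepts an async (err, page) callback with the same signature as the onPageEvent handler shown in the diff. persistPage is a hypothetical helper used only for illustration.

import { Website, type NPage } from "@spider-rs/spider-rs";

const website = new Website("https://rsseau.fr");
const pages: NPage[] = [];

// Async handler: per the docs text, the callback passed to crawl/scrape may be async.
const onPageEvent = async (err: Error | null, page: NPage) => {
  if (err) return;
  pages.push(page);
  // await persistPage(page); // hypothetical async work per crawled page
};

await website.crawl(onPageEvent);
console.log(pages.length, website.getLinks().length);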

</main>
18 changes: 18 additions & 0 deletions print.html
@@ -192,6 +192,9 @@ <h2 id="installation"><a class="header" href="#installation">Installation</a></h2>
<li><code>npm i @spider-rs/spider-rs --save</code></li>
</ol>
<h2 id="usage"><a class="header" href="#usage">Usage</a></h2>
<p>The examples below can help you get started with spider.</p>
<h3 id="basic"><a class="header" href="#basic">Basic</a></h3>
<p>A basic example.</p>
<pre><code class="language-ts">import { Website } from &quot;@spider-rs/spider-rs&quot;;

const website = new Website(&quot;https://rsseau.fr&quot;)
@@ -207,6 +210,21 @@ <h2 id="usage"><a class="header" href="#usage">Usage</a></h2>

await website.crawl();
console.log(website.getLinks());
</code></pre>
<h3 id="events"><a class="header" href="#events">Events</a></h3>
<p>You can pass a function, which may be async, as a parameter to <code>crawl</code> and <code>scrape</code>.</p>
<pre><code class="language-ts">import { Website, type NPage } from &quot;@spider-rs/spider-rs&quot;;

const website = new Website(&quot;https://rsseau.fr&quot;);

const links: NPage[] = [];

const onPageEvent = (err: Error | null, value: NPage) =&gt; {
links.push(value);
};

await website.crawl(onPageEvent);
console.log(website.getLinks());
</code></pre>

</main>