This repository contains loose-it, a tool to pre-build Polymer metadata. The tool can be integrated into the existing Polymer tooling build pipeline. It relies on a custom version of Polymer that can work with a minified, serialized representation of Polymer metadata.
Integration requires direct interaction with the Polymer tooling pipeline using gulp. An example integration with gulp is shown in custom-build.

Update the gulpfile as follows:
Add the following import:
```js
const looseIt = require('loose-it').PreBuildBindings;
```
Update the integration with the sources and dependencies streams:
```js
// Let's start by getting your source files. These are all the
// files in your `src/` directory, or those that match your
// polymer.json "sources" property if you provided one.
let sourcesStream = polymerProject.sources();

// Similarly, you can get your dependencies separately and perform
// any dependency-only optimizations here as well.
let dependenciesStream = polymerProject.dependencies();

let buildStream = mergeStream(sourcesStream, dependenciesStream)
  // Apply the tool
  .pipe(new looseIt(polymerProject.config))
  .pipe(sourcesStreamSplitter.split())
  .pipe(gulpif(/\.js$/, babili()))
  .pipe(gulpif(/\.css$/, cssSlam()))
  .pipe(gulpif(/\.html$/, cssSlam()))
  .pipe(gulpif(/\.html$/, htmlMinify()))
  // Remember, you need to rejoin any split inline code when you're done.
  .pipe(sourcesStreamSplitter.rejoin())
  .once('data', () => {
    console.log('Analyzing build dependencies...');
  });
```
As the snippet above shows, instead of processing the sources and dependencies streams separately, they must be merged up front. The tool hooks directly into the merged stream and analyzes it using the configuration of `polymerProject`. Only afterwards is the stream split to apply per-file optimizations such as minification.
Internally, the tool performs the following steps:

- Read in all files.
- Analyze the files with polymer-analyzer.
- Based on the analysis, obtain a DFS traversal of the HTML imports.
- Yield all files that were in the stream but not in the traversal back into the stream.
- Launch headless Chrome using Puppeteer.
- For each document in the DFS traversal:
  - Execute all scripts in the document.
  - For each defined element in the document:
    - Define its dom-module in the browser.
    - Obtain metadata (bindings, property effects) from the browser.
    - Write the binding metadata in front of the JS AST node of the element.
  - Serialize all ASTs in the document back into a file.
  - Yield the potentially modified file contents back into the stream.
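The DFS traversal of HTML imports mentioned above can be sketched as follows. This is an illustrative post-order DFS (so each document's imports come before the document itself); the real tool derives the import graph from polymer-analyzer rather than from a hand-written map, and `dfsImports` is a hypothetical name.

```javascript
// graph: Map from a document URL to the URLs it imports.
// Returns documents in dependency order (imports before importers).
function dfsImports(graph, entrypoint) {
  const visited = new Set();
  const order = [];
  (function visit(url) {
    if (visited.has(url)) return; // each document is processed once
    visited.add(url);
    // Recurse into imports first so that dependencies precede the
    // documents that rely on them.
    for (const dep of graph.get(url) || []) visit(dep);
    order.push(url);
  })(entrypoint);
  return order;
}
```

A shared import (e.g. a file imported by two documents) appears only once in the resulting order, which is why files outside the traversal can simply be yielded back into the stream untouched.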