1. A first taste of build automation
- yarn init
- Create scss/main.scss
- yarn add sass --dev
- Compile SCSS into CSS with the command ./node_modules/.bin/sass scss/main.scss css/style.css
Running the same compile command by hand after every change gets tedious, so add NPM Scripts to package.json to define script commands for the project's development workflow.
Running npm run build or yarn build then does the job. NPM Scripts are also the simplest way to implement build automation.
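For example, a first version of the scripts field could contain just the build command from above; this is a minimal sketch, and the serve and start entries are added in the steps below:

```json
{
  "scripts": {
    "build": "sass scss/main.scss css/style.css"
  }
}
```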
- Install a test server: yarn add browser-sync --dev
- Configure a serve script in package.json:

```json
"scripts": {
  "build": "sass scss/main.scss css/style.css",
  "serve": "browser-sync ."
}
```

- Add --watch so Sass recompiles whenever the source files change; note that the watcher blocks the command line while it runs (preserve is an npm pre-hook that runs build before serve):

```json
"scripts": {
  "build": "sass scss/main.scss css/style.css --watch",
  "preserve": "yarn build",
  "serve": "browser-sync ."
}
```

- npm run serve (the preserve hook runs build first)
- Add --files "css/*.css" so Browser Sync watches those files and hot-updates them in the browser when they change; run-p (from the npm-run-all package) runs build and serve in parallel, which works around the blocking --watch:

```json
"scripts": {
  "build": "sass scss/main.scss css/style.css --watch",
  "serve": "browser-sync . --files \"css/*.css\"",
  "start": "run-p build serve"
}
```

- npm run start
2. Common build tools
- Grunt
  Because it works through temporary files, it performs frequent disk I/O, which makes builds noticeably slow in very large projects.
- Gulp
  Because it works in memory it is much faster, and it can run multiple tasks in parallel, which greatly improves efficiency. Compared with Grunt its usage is also more intuitive, and its ecosystem is mature; it is currently the most popular build system.
- FIS
  A build system from Baidu's front-end team. Compared with the micro-kernel approach of the previous two, FIS is more like a bundled suite: it integrates many typical requirements such as resource loading, modular development, code deployment and performance optimization.
3. Grunt
- Basic use of Grunt
- yarn init
- yarn add grunt
- Create the entry file gruntfile.js, as follows:

```js
// Grunt entry file
// Defines the tasks that Grunt should run automatically
// It must export a function; the function receives a grunt object that provides the APIs used to create tasks
module.exports = grunt => {
  grunt.registerTask('foo', () => {
    console.log('hello Grunt')
  })
}
```
- Run the task with yarn grunt foo; foo is the name of the task to run
- If the second argument to grunt.registerTask is a string, it is used as the task description and shows up in yarn grunt --help
- A task named default can be run with just yarn grunt; it is usually used to map a sequence of other tasks, as follows:

```js
module.exports = grunt => {
  grunt.registerTask('foo', () => {
    console.log('hello foo')
  })
  grunt.registerTask('bar', 'Task description', () => {
    console.log('hello bar')
  })
  grunt.registerTask('default', ['foo', 'bar'])
}
```
- Handling asynchronous operations: 1. use a regular function instead of an arrow function (the task needs its own this); 2. call this.async() and invoke done() to tell Grunt the asynchronous task has finished:

```js
module.exports = grunt => {
  grunt.registerTask('async-task2', function () {
    const done = this.async()
    setTimeout(() => {
      console.log('async-task working')
      done()   // use done(false) to mark the asynchronous task as failed
    }, 1000)
  })
}
```
- Marking a task as failed: if a task in the chain returns false, the tasks after it are not executed:

```js
module.exports = grunt => {
  grunt.registerTask('foo', () => {
    console.log('hello foo')
  })
  grunt.registerTask('bar', 'Task description', () => {
    console.log('hello bar')
  })
  grunt.registerTask('bad', function () {
    console.log('bad working')
    return false
  })
  grunt.registerTask('default', ['foo', 'bad', 'bar'])   // the bar task will not run
}
```
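For completeness, a failing asynchronous task is marked the same way through the callback; this is a small sketch based on the done(false) note above:

```js
module.exports = grunt => {
  grunt.registerTask('bad-async', function () {
    const done = this.async()
    setTimeout(() => {
      console.log('bad-async working')
      done(false)   // mark the asynchronous task as failed; later tasks in the chain will not run
    }, 1000)
  })
}
```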
- Grunt configuration (yarn grunt configMsg):

```js
module.exports = grunt => {
  grunt.initConfig({
    foo: {
      bar: 123
    }
  })
  grunt.registerTask('configMsg', function () {
    console.log(grunt.config('foo'))
  })
}
```
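grunt.config also accepts a dot-separated path to read a nested value directly; a small sketch using the same config as above:

```js
module.exports = grunt => {
  grunt.initConfig({
    foo: {
      bar: 123
    }
  })
  grunt.registerTask('configMsg', function () {
    console.log(grunt.config('foo'))       // { bar: 123 }
    console.log(grunt.config('foo.bar'))   // 123
  })
}
```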
- Grunt multi-target tasks (yarn grunt build):

```js
module.exports = grunt => {
  grunt.initConfig({
    foo: {
      bar: 123
    },
    build: {
      // options of the build task
      options: {
        foo: 'bar'
      },
      css: {
        // target-level options override the task-level options
        options: {
          foo: 'bar1'
        }
      },
      js: '2'
    }
  })
  grunt.registerMultiTask('build', function () {
    console.log(this.options())
    console.log(`target: ${this.target}, data: ${this.data}`)
  })
}
```
- Use of Grunt plug-ins
- yarn add grunt-contrib-clean
- yarn grunt clean

```js
module.exports = grunt => {
  grunt.initConfig({
    clean: {
      temp: 'temp/app.js'      // delete the app.js file under temp
      // temp: 'temp/*.txt'    // delete all .txt files under temp
      // temp: 'temp/**'       // delete everything under temp
    }
  })
  // loadNpmTasks imports the tasks provided by the plugin; the tasks read their configuration from initConfig
  grunt.loadNpmTasks('grunt-contrib-clean')
}
```
- yarn add grunt-sass sass --dev
- yarn grunt sass

```js
const sass = require('sass')

module.exports = grunt => {
  grunt.initConfig({
    sass: {
      options: {
        sourceMap: true,        // generate the corresponding main.css.map file
        implementation: sass    // use the sass module as the compiler
      },
      main: {
        files: {
          './scss/main.css': './scss/main.scss'   // 'output path': 'source file'
        }
      }
    }
  })
  // loadNpmTasks imports the tasks provided by the plugin
  grunt.loadNpmTasks('grunt-sass')
}
```
- yarn add grunt-babel @babel/core @babel/preset-env --dev
- yarn add load-grunt-tasks --dev. This automatically loads the tasks of every installed Grunt plugin, so there is no need to call grunt.loadNpmTasks('…') for each one:

```js
// Grunt entry file
const sass = require('sass')
const loadGruntTasks = require('load-grunt-tasks')

module.exports = grunt => {
  grunt.initConfig({
    sass: {
      options: {
        sourceMap: true,       // generate the corresponding main.css.map file
        implementation: sass
      },
      main: {
        files: {
          './dist/css/main.css': './src/scss/main.scss'   // 'output path': 'source file'
        }
      }
    },
    babel: {
      options: {
        sourceMap: true,                 // generate the corresponding app.js.map file
        presets: ['@babel/preset-env']   // presets: which features to transpile
      },
      main: {
        files: {
          './dist/js/app.js': './src/js/app.js'
        }
      }
    }
  })
  // grunt.loadNpmTasks('grunt-sass') is no longer needed:
  // automatically load all tasks from the installed Grunt plugins
  loadGruntTasks(grunt)
}
```
- yarn grunt sass
- yarn grunt babel
- yarn add grunt-contrib-watch --dev
- Configure the watch task:

```js
// Grunt entry file
const sass = require('sass')
const loadGruntTasks = require('load-grunt-tasks')

module.exports = grunt => {
  grunt.initConfig({
    sass: {
      options: {
        sourceMap: true,
        implementation: sass
      },
      main: {
        files: {
          './dist/css/main.css': './src/scss/main.scss'
        }
      }
    },
    babel: {
      options: {
        sourceMap: true,
        presets: ['@babel/preset-env']
      },
      main: {
        files: {
          './dist/js/app.js': './src/js/app.js'
        }
      }
    },
    watch: {
      js: {
        files: ['src/js/*.js'],
        tasks: ['babel']
      },
      css: {
        files: ['src/scss/*.scss'],   // .scss is the newer syntax of Sass
        tasks: ['sass']
      }
    }
  })
  // automatically load all tasks from the installed Grunt plugins
  loadGruntTasks(grunt)
  // watch only reacts to file changes and does not compile anything on startup,
  // so run sass and babel once before starting the watcher
  grunt.registerTask('default', ['sass', 'babel', 'watch'])
}
```
- yarn grunt watch
- yarn grunt
4. Gulp
- Basic use
- yarn init --yes
- yarn add gulp --dev
- Create a gulpfile.js file:

```js
// Gulp entry file
// Gulp is a stream-based build tool built around the idea of a build pipeline;
// every task is asynchronous, there are no synchronous tasks
exports.foo = done => {
  console.log('foo working')
  done()   // signal task completion
}

exports.default = done => {
  console.log('default task, run when no task name is given')
  done()
}

// Before gulp 4.0, tasks had to be registered through gulp.task()
const gulp = require('gulp')
gulp.task('bar', done => {
  console.log('bar working')
  done()
})
```
- yarn gulp foo
- Combined tasks:

```js
// Combined tasks
const { series, parallel } = require('gulp')

const task1 = done => {
  setTimeout(() => {
    console.log('task1 working~')
    done()
  }, 1000)
}

const task2 = done => {
  setTimeout(() => {
    console.log('task2 working~')
    done()
  }, 1000)
}

const task3 = done => {
  setTimeout(() => {
    console.log('task3 working~')
    done()
  }, 1000)
}

// series: run tasks one after another (e.g. deployment steps)
exports.a = series(task1, task2, task3)
// parallel: run tasks at the same time (e.g. compiling CSS and JS independently)
exports.b = parallel(task1, task2, task3)
```
- Handling asynchronous flows in Gulp:

```js
const fs = require('fs')

// Callback style
exports.callback = done => {
  console.log('callback task~')
  done()
}

exports.callback_error = done => {
  console.log('callback task~')
  done(new Error('task error!'))
}

// Promise style
exports.promise = () => {
  console.log('promise task~')
  return Promise.resolve()
}

exports.promise_error = () => {
  console.log('promise task~')
  return Promise.reject(new Error('promise error!'))
}

// async/await style
const timeout = time => {
  return new Promise(resolve => {
    setTimeout(() => resolve('aaa'), time)
  })
}

exports.async = async () => {
  const a = await timeout(1000)
  console.log(a)
  console.log('async task~')
}

// Stream style
exports.stream = done => {
  const readStream = fs.createReadStream('package.json')   // read content
  const writeStream = fs.createWriteStream('temp.txt')     // output file
  readStream.pipe(writeStream)                             // pipe the content across
  // returning the stream is enough: Gulp listens for its 'end' event,
  // which is equivalent to
  // readStream.on('end', () => { done() })
  return readStream
}
```
- Simulating a file compression operation with raw streams:

```js
const fs = require('fs')
const { Transform } = require('stream')

// Simulate file compression
exports.default = () => {
  // file read stream
  const read = fs.createReadStream('normalize.css')
  // file write stream
  const write = fs.createWriteStream('normalize.min.css')
  // transform stream: the core conversion happens here
  const transform = new Transform({
    transform: (chunk, encoding, callback) => {
      // chunk: the content read from the stream (a Buffer)
      const input = chunk.toString()
      // strip whitespace and comments with regular expressions
      const output = input.replace(/\s+/g, '').replace(/\/\*.+?\*\//g, '')
      callback(null, output)
    }
  })

  read
    .pipe(transform)   // convert
    .pipe(write)       // write out
  return read
}
```
- Gulp file operation API
- yarn add gulp-clean-css --dev
- yarn add gulp-rename --dev

```js
// Gulp file operation API
const { src, dest } = require('gulp')
const cleanCss = require('gulp-clean-css')
const rename = require('gulp-rename')

exports.GulpApi = () => {
  return src('src/*.css')                    // src creates a read stream for the matched files
    .pipe(cleanCss())                        // gulp-clean-css: third-party transform stream that compresses the CSS
    .pipe(rename({ extname: '.min.css' }))   // gulp-rename: transform stream that renames the output files
    .pipe(dest('dist'))                      // dest creates the write stream
}
// As the example shows, the Gulp API is much more convenient than the raw stream version above
```
- A Gulp build automation case study
Case code address: git clone https://github.com/zce/zce-gulp-demo.git
Sass file compilation task:
- Use src and dest from the gulp module to read and write files, and gulp-sass to compile the .scss files
- yarn add gulp --dev
- yarn add gulp-sass

```js
const { src, dest } = require('gulp')   // dest: target location
const sass = require('gulp-sass')

const style = () => {
  return src('src/assets/styles/*.scss', { base: 'src' })   // base: keep the directory structure below src
    // gulp-sass converts .scss to .css; files whose name starts with _ are treated as partials and ignored;
    // outputStyle: 'expanded' fully expands the braces in the compiled output
    .pipe(sass({ outputStyle: 'expanded' }))
    .pipe(dest('dist'))   // publish directory
}

module.exports = { style }
```
- yarn gulp style
- Use src and dest from the gulp module to read and write files, and gulp-babel to compile the .js files
- yarn add gulp-babel --dev
- yarn add @babel/core @babel/preset-env --dev (@babel/core does the actual transformation; @babel/preset-env covers all ES6+ features)

```js
const { src, dest } = require('gulp')   // dest: target location
const babel = require('gulp-babel')

const script = () => {
  return src('src/assets/scripts/*.js', { base: 'src' })
    .pipe(babel({ presets: ['@babel/preset-env'] }))
    .pipe(dest('dist'))
}

module.exports = { script }
```
- yarn add gulp-swig --dev

```js
const { src, dest } = require('gulp')   // dest: target location
const swig = require('gulp-swig')

// Page template compilation
const page = () => {
  return src('src/*.html', { base: 'src' })   // html files directly under src
    .pipe(swig())                             // the data option supplies the data used inside the templates
    .pipe(dest('dist'))
}

module.exports = { page }
```

```js
// Combine the three tasks above
const { parallel } = require('gulp')

const compile = parallel(style, script, page)

module.exports = { compile }
```

Image and font file extraction
- yarn add gulp-imagemin --dev

```js
const { src, dest, parallel } = require('gulp')   // dest: target location
// gulp-imagemin depends on native modules written in C++; installing it downloads prebuilt binaries from GitHub,
// which can be slow or fail on some networks
const imagemin = require('gulp-imagemin')

// Process image files
const image = () => {
  return src('src/assets/images/**', { base: 'src' })   // any file under the images directory
    .pipe(imagemin())                                    // lossless compression: only strips metadata
    .pipe(dest('dist'))
}

// Process font files
const font = () => {
  return src('src/assets/fonts/**', { base: 'src' })
    // font directories often contain svg as well, which imagemin can handle;
    // files it cannot compress pass through untouched
    .pipe(imagemin())
    .pipe(dest('dist'))
}

// Combined task
const compile = parallel(image, font)

module.exports = { compile }
```
- yarn add del --dev

```js
const { src, dest, parallel, series, watch } = require('gulp')   // dest: target location
const del = require('del')   // deletion helper

// Extra copy task: copy everything under public as-is
const extra = () => {
  return src('public/**')
    .pipe(dest('dist'))
}

// Clean before every build
const clean = () => {
  return del(['dist'])
}

module.exports = { clean, extra }
```
As the build grows more complex, more and more plugins are involved. All of them can be loaded automatically through the gulp-load-plugins plugin; each plugin is exposed with the gulp- prefix dropped and the rest camel-cased, e.g. gulp-clean-css becomes plugins.cleanCss.
- Configuration with gulp-load-plugins:

```js
const { src, dest, parallel, series, watch } = require('gulp')   // dest: target location
// Auto-load plugins
const loadPlugins = require('gulp-load-plugins')
const plugins = loadPlugins()
// None of the following requires are needed any more:
// const sass = require('gulp-sass')
// const babel = require('gulp-babel')
// const swig = require('gulp-swig')
// const imagemin = require('gulp-imagemin')
const del = require('del')

// Process image files
const image = () => {
  return src('src/assets/images/**')
    .pipe(plugins.imagemin())   // use each plugin through the plugins object
    .pipe(dest('dist'))
}
```
- yarn add browser-sync --dev

```js
// Development server
const bs = require('browser-sync')

const serve = () => {
  bs.init({                 // init: initialise the server configuration
    notify: false,          // turn off the "connected" notification in the page
    port: 2080,             // port number
    open: true,             // open the browser automatically
    files: 'dist/**',       // files to watch for hot updates
    server: {
      baseDir: 'dist',      // web server root: the code the browser runs
      routes: {             // mappings for third-party references
        '/node_modules': 'node_modules'
      }
    }
  })
}

module.exports = { serve }
```

```js
// JS file compilation task
const script = () => {
  return src('src/assets/scripts/*.js')
    .pipe(plugins.babel({ presets: ['@babel/preset-env'] }))
    .pipe(dest('dist'))
    .pipe(bs.reload({ stream: true }))   // push the changed files to the browser as a stream
}

const serve = () => {
  // watch: listen to file-path globs and recompile into dist automatically
  watch('src/assets/styles/*.scss', style)
  watch('src/assets/scripts/*.js', script)
  watch('src/*.html', page)
  // ... (bs.init configuration as above)
}
```

Build optimization
The main question is which tasks are needed in the development environment and which only in production. For example, the files under images, fonts and public need very little processing during development, so those tasks can be skipped there and run only for the production build.
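To make that split concrete, here is a minimal sketch of the task composition that the full gulpfile below arrives at; the stub function is only a placeholder for the real clean/style/script/page/image/font/extra/serve/useref tasks.

```js
// Sketch only: stubs stand in for the real tasks defined in the full gulpfile below
const { series, parallel } = require('gulp')

const stub = name => done => { console.log(`${name} done`); done() }
const [clean, style, script, page, image, font, extra, serve, useref] =
  ['clean', 'style', 'script', 'page', 'image', 'font', 'extra', 'serve', 'useref'].map(stub)

// Development: only compile what changes often, then start the dev server
const compile = parallel(style, script, page)
const start = series(compile, serve)

// Production: clean first, then compile + useref, plus images/fonts/public in parallel
const build = series(clean, parallel(series(compile, useref), image, font, extra))

module.exports = { start, build }
```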
useref: reference handling and file compression
useref works through build comments in the HTML: the files referenced between the comments are concatenated into the output file named in the comment, e.g. bootstrap.css is built into vendor.css. After useref runs it leaves some fairly large uncompressed files (html, js, css), which is where the gulp-htmlmin, gulp-uglify and gulp-clean-css plugins come in; because each file type needs a different compression step, the gulp-if plugin is used to branch on the file type.
- yarn add gulp-useref --dev
- yarn add gulp-htmlmin gulp-uglify gulp-clean-css --dev
- yarn add gulp-if --dev

```js
const useref = () => {
  return src('dist/*.html')
    // look for referenced files first under dist, then relative to the project root
    .pipe(plugins.useref({ searchPath: ['dist', '.'] }))
    // useref leaves large uncompressed html/js/css files; compress each type separately with gulp-if
    .pipe(plugins.if(/\.css$/, plugins.cleanCss()))
    .pipe(plugins.if(/\.js$/, plugins.uglify()))
    // by default htmlmin only removes a little whitespace; collapseWhitespace does the real compression,
    // and minifyCSS / minifyJS handle inline style and script blocks
    .pipe(plugins.if(/\.html$/, plugins.htmlmin({
      collapseWhitespace: true,
      minifyCSS: true,
      minifyJS: true
    })))
    // this cannot be written back into dist, because reading and writing the same files at the same time causes problems
    // .pipe(dest('dist'))
    .pipe(dest('release'))
}
```

Overall build task planning
```js
// dest: target location
const { src, dest, parallel, series, watch } = require('gulp')
// Auto-load plugins
const loadPlugins = require('gulp-load-plugins')
const plugins = loadPlugins()
// With plugins in place the individual requires are no longer needed:
// const sass = require('gulp-sass')
// const babel = require('gulp-babel')
// const swig = require('gulp-swig')
// gulp-imagemin relies on native modules; installing it downloads prebuilt binaries from GitHub, which can be slow or fail
// const imagemin = require('gulp-imagemin')
// Deletion task
const del = require('del')
// Development server
const bs = require('browser-sync')

// Data used by the page templates
const data = {
  menus: [
    { name: 'Home', icon: 'aperture', link: 'index.html' },
    { name: 'Features', link: 'features.html' },
    { name: 'About', link: 'about.html' },
    {
      name: 'Contact',
      link: '#',
      children: [
        { name: 'Twitter', link: 'https://twitter.com/w_zce' },
        { name: 'About', link: 'https://weibo.com/zceme' },
        { name: 'divider' },
        { name: 'About', link: 'https://github.com/zce' }
      ]
    }
  ],
  pkg: require('./package.json'),
  date: new Date()
}

// Clean before every build
const clean = () => {
  return del(['dist', 'static'])
}

// Sass compilation task
const style = () => {
  return src('src/assets/styles/*.scss', { base: 'src' })   // base: keep the path structure below src
    // partials whose name starts with _ are treated as helpers and ignored
    .pipe(plugins.sass({ outputStyle: 'expanded' }))
    .pipe(dest('static'))
    .pipe(bs.reload({ stream: true }))   // push the change to the browser as a stream (hot update)
}

// JS compilation task
const script = () => {
  return src('src/assets/scripts/*.js', { base: 'src' })
    .pipe(plugins.babel({ presets: ['@babel/preset-env'] }))
    .pipe(dest('static'))
    .pipe(bs.reload({ stream: true }))
}

// Page template compilation
const page = () => {
  return src('src/*.html', { base: 'src' })   // html files directly under src
    .pipe(plugins.swig({ data }))             // data: the data used inside the templates
    .pipe(dest('static'))
    .pipe(bs.reload({ stream: true }))
}

// Process image files
const image = () => {
  return src('src/assets/images/**', { base: 'src' })
    .pipe(plugins.imagemin())   // lossless compression: only strips metadata
    .pipe(dest('dist'))
}

// Process font files
const font = () => {
  return src('src/assets/fonts/**', { base: 'src' })
    // svg files in the font directory are handled by imagemin too; anything it cannot compress passes through untouched
    .pipe(plugins.imagemin())
    .pipe(dest('dist'))
}

// Extra copy task
const extra = () => {
  return src('public/**', { base: 'public' })
    .pipe(dest('dist'))
}

// Development server
const serve = () => {
  // watch: listen to file-path globs and recompile automatically
  watch('src/assets/styles/*.scss', style)
  watch('src/assets/scripts/*.js', script)
  watch('src/*.html', page)
  // To keep the development build fast, images, fonts and public files are not processed here;
  // they are handled only for the production build. But what if those files change?
  // watch('src/assets/images/**', image)
  // watch('src/assets/fonts/**', font)
  // watch('public/**', extra)
  // Solved as follows: when any of these files change, just trigger a browser reload
  // (the compile tasks above instead push their output through bs.reload({ stream: true }) in the pipe)
  watch([
    'src/assets/images/**',
    'src/assets/fonts/**',
    'public/**'
  ], bs.reload)

  bs.init({            // init: initialise the server configuration
    notify: false,     // turn off the "connected" notification in the page
    port: 2080,        // port number
    open: true,        // open the browser automatically
    // Hot-update files: not needed once the js/html/css tasks push changes through bs.reload({ stream: true }),
    // so there is no need to brute-force watch everything under dist
    // files: 'dist/**',
    server: {
      // Web server root: when a resource cannot be found under one path it is looked up in the next one
      baseDir: ['static', 'src', 'public'],
      // The node_modules mapping only works in the development environment; it does not exist in production,
      // so those references have to be packed into the code during the production build with gulp-useref
      routes: {
        '/node_modules': 'node_modules'
      }
    }
  })
}

// useref works through build comments, e.g. bootstrap.css is built into vendor.css:
// <!-- build:css assets/styles/vendor.css -->
// <link rel="stylesheet" href="/node_modules/bootstrap/dist/css/bootstrap.css">
// <!-- endbuild -->
const useref = () => {
  return src('static/*.html')
    // look for referenced files first under static, then relative to the project root
    .pipe(plugins.useref({ searchPath: ['static', '.'] }))
    // useref leaves large uncompressed html/js/css files; compress each type separately with gulp-if
    .pipe(plugins.if(/\.css$/, plugins.cleanCss()))
    .pipe(plugins.if(/\.js$/, plugins.uglify()))
    // by default htmlmin only removes a little whitespace; collapseWhitespace does the real compression,
    // and minifyCSS / minifyJS handle inline style and script blocks
    .pipe(plugins.if(/\.html$/, plugins.htmlmin({
      collapseWhitespace: true,
      minifyCSS: true,
      minifyJS: true
    })))
    // reading from static means dist is free to be written to
    .pipe(dest('dist'))
}

// Compile everything under src
const compile = parallel(style, script, page)

// Production build
const build = series(clean, parallel(series(compile, useref), image, font, extra))

// Development mode
const start = series(compile, serve)

module.exports = { start, build, useref, compile }
```

5. Basic use of FIS
FIS is no longer actively maintained.
- cnpm i fis3 -g
- fis3 release -d output: builds the code in the project root into the output directory
- cnpm i fis-parser-node-sass -g: compiles Sass files
- cnpm i fis-parser-babel-6.x -g: compiles JS files

```js
// Match js, scss and png files and place them under assets; $0 stands for the matched file path.
// The main purpose is to improve portability in projects where front end and back end are not separated.
fis.match('*.{js,scss,png}', {
  release: '/assets/$0'
})

// scss files in any directory
fis.match('**/*.scss', {
  rExt: '.css',
  parser: fis.plugin('node-sass'),
  // compress css
  optimizer: fis.plugin('clean-css')
})

// js files in any directory
fis.match('**/*.js', {
  // FIS is no longer maintained and only supports babel 6.x
  parser: fis.plugin('babel-6.x'),
  // compress js
  optimizer: fis.plugin('uglify-js')
})
```