Are there any aggressive CSS minification tools?

Submitted by 放肆的年华 on 2019-12-02 19:29:52

I don't know any aggressive CSS minification tool, but you could use the following approach:

Setup

  1. Expand your CSS into longhand (e.g. margin:1px 0 0 0; becomes margin-top:1px; margin-right:0; margin-bottom:0; margin-left:0;).
  2. Build a graph G = (V,E) with V as set of vertices and E as set of edges:
    • V is the union of the two sets A (unique selectors, e.g. div, p > span, #myid) and B (unique properties, e.g. display:block;, color:#deadbeef;).
    • E consists of all associations between a selector (in A) and a property (in B).
  3. Use an appropriate weight function c for the elements of B. This could be the number of neighbors of a given element of B, or the accumulated length of the properties minus the accumulated length of the selectors. Your choice.
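The setup above can be sketched in Python. The selectors and properties here are illustrative, and since the answer leaves the weight function open, this sketch just uses "number of selectors covered":

```python
# Toy input: expanded (longhand) rules, mapping each selector to its properties.
# Selector names and property strings are made up for illustration.
rules = {
    "div":      {"margin-top:1px", "color:#000"},
    "p > span": {"color:#000", "display:block"},
    "#myid":    {"color:#000"},
}

# Bipartite graph G = (V, E): V is the union of A (selectors) and B (properties),
# E holds one edge per selector-property association.
A = set(rules)
B = {p for props in rules.values() for p in props}
E = {(sel, prop) for sel, props in rules.items() for prop in props}

def weight(prop):
    """One possible weight function c: how many selectors use this property."""
    return sum(1 for sel in A if (sel, prop) in E)
```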

You may notice that by using this approach you'll get a bipartite graph. Now try the following greedy algorithm (pseudo code):

Algorithm

  1. Take an element b in B that has maximum weight and add it to an empty set Z
  2. Check whether another element d in B has same weight
    • if such a d exists check whether it covers the same selectors.
      1. If d covers the same selectors: add d to Z and go to step 2.
      2. If d does not cover the same selectors, check the next element with the same weight, or go to step 3 once all candidates d have been checked.
  3. Now Z is a set of properties covering some selectors. Parse this as CSS into a buffer.
  4. Delete all elements in Z and their adjacent edges in G and delete Z itself.
  5. If B is not empty go to step 1.
  6. Your buffer now contains pre-minified CSS. You can merge longhand properties back into shorthands (e.g. margin-top:0px; margin-left:1px; together with the other margin sides collapse into a single margin declaration).
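A minimal Python sketch of the greedy loop above, assuming the "number of covered selectors" weight function. Grouping by identical selector cover stands in for the weight-then-cover check of step 2 (under this weight function, an identical cover implies an identical weight), and deleting the group from the working set plays the role of steps 4-5:

```python
def minify(rules):
    """Greedy grouping: repeatedly pick the heaviest remaining property,
    gather all properties covering exactly the same selector set (the set Z),
    and emit one combined CSS rule for the group."""
    # Adjacency: property -> frozenset of selectors it applies to.
    cover = {}
    for sel, props in rules.items():
        for p in props:
            cover.setdefault(p, set()).add(sel)
    cover = {p: frozenset(sels) for p, sels in cover.items()}

    out = []
    remaining = set(cover)
    while remaining:                                   # step 5: loop until B is empty
        # Step 1: property b with maximum weight (selectors covered).
        b = max(remaining, key=lambda p: len(cover[p]))
        # Step 2: every property with the same selector cover joins Z.
        Z = sorted(p for p in remaining if cover[p] == cover[b])
        # Step 3: parse Z as one CSS rule into the buffer.
        out.append(",".join(sorted(cover[b])) + "{" + ";".join(Z) + "}")
        # Step 4: delete Z and its edges from the graph.
        remaining -= set(Z)
    return "".join(out)
```

For example, two selectors sharing a property end up in one rule, and the remaining property gets its own rule afterwards.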

Remarks

Please note that the actual compression depends on your weight function. As this is a greedy algorithm it will usually produce a well-minified CSS file, but I believe someone will post a counterexample. Note also that you have to update your weight function after deleting the elements in Z.

Runtime estimate

The algorithm will always terminate and will run in O(|B|^2*|A|) if I'm not mistaken. If you use a heap and sort the properties in each adjacency list (setup time O(|B|*|A|log(|A|))) you'll get O(|B|*|A|* (log(|B|)+log(|A|))).

CSS Tidy works like a champ!

  • colours like "black" or rgb(0,0,0) are converted to #000000 or rather #000 if possible.
  • Some hex-codes are replaced by their colour names if they are shorter.
  • a{property:x;property:y;} becomes a{property:y;}
  • (all duplicate properties are merged) margin:1px 1px 1px 1px; becomes margin:1px;
  • margin:0px; becomes margin:0;
  • a{margin-top:10px; margin-bottom:10px; margin-left:10px; margin-right:10px;} becomes a{margin:10px;}
  • margin:010.0px; becomes margin:10px;
  • all unnecessary whitespace is removed
  • depending on the compression-level all background-properties are merged
  • all comments are removed
  • the last semicolon in every block can be removed
  • missing semicolons are added
  • incorrect newlines in strings are fixed
  • missing units are added
  • bad colors (and color names) are fixed
  • property:value ! important; becomes property:value !important;
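The shorthand merging listed above can be illustrated roughly like this. This is a hand-rolled sketch covering only the margin case, not CSS Tidy's actual code:

```python
def merge_margin(props):
    """Collapse the four margin-* longhands into the margin shorthand,
    mimicking one optimisation from the feature list above.
    props is a list of "name:value" declaration strings."""
    sides = ["margin-top", "margin-right", "margin-bottom", "margin-left"]
    decl = dict(p.split(":", 1) for p in props)
    if all(s in decl for s in sides):
        t, r, b, l = (decl.pop(s) for s in sides)
        if t == r == b == l:
            decl["margin"] = t                  # margin:10px 10px 10px 10px -> margin:10px
        else:
            decl["margin"] = f"{t} {r} {b} {l}"
    return [f"{k}:{v}" for k, v in decl.items()]
```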

Have you seen YUI Compressor?

A project called CSS Tools claims to do this.
