I need to find all places in a bunch of HTML files that lie in the following structure (CSS):
div.a ul.b
or XPath:
//div[@class="a"]//ul[@class="b"]
grep doesn't help me here. Is there a command-line tool that returns all files (and optionally all places therein) that match this criterion? I.e., one that returns file names if the file matches a certain HTML or XML structure.
Try this:
- Install http://www.w3.org/Tools/HTML-XML-utils/.
- Save a web page (call it filename.html).
- Run:
hxnormalize -l 240 -x filename.html | hxselect -s '\n' -c "label.black"
Here "label.black" is the CSS selector that identifies the HTML elements you want. To make this reusable, write a helper script named cssgrep:
#!/bin/bash
# Normalize the HTML (ignoring parse errors) and print the elements matching the selector.
hxnormalize -l 240 -x "$1" 2>/dev/null | hxselect -s '\n' -c "$2"
You can then run:
cssgrep filename.html "label.black"
This will print the content of all HTML label elements with class black. See also: https://superuser.com/a/529024/9067
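If you need the list of matching files rather than the matched content, a small wrapper around cssgrep can report every file whose output is non-empty. This is only a sketch built on the script above; the selector and the *.html glob are placeholders to adapt:
#!/bin/bash
# Sketch: list the HTML files in the current directory that contain at least
# one element matching the given CSS selector, using the cssgrep script above.
selector="div.a ul.b"
for f in *.html; do
    # cssgrep prints the matched elements; -n tests for non-empty output.
    if [ -n "$(cssgrep "$f" "$selector")" ]; then
        echo "$f"
    fi
done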
I have built a command-line tool with Node.js that does just this. You enter a CSS selector, and it will search through all of the HTML files in the directory and tell you which files have matches for that selector.
You will need to install Element Finder, cd into the directory you want to search, and then run:
elfinder -s "div.a ul.b"
For more information, please see http://keegan.st/2012/06/03/find-in-files-with-css-selectors/
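As a rough sketch, assuming the tool is published on npm under the name element-finder (the binary is elfinder, as shown above), installing and running it would look like:
# Assumption: the package name on npm is "element-finder"; check the linked
# blog post for the authoritative install instructions.
npm install -g element-finder
cd /path/to/html/files
elfinder -s "div.a ul.b"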
Per Nat's answer here:
Command-line tools that can be called from shell scripts include:
- 4xpath - command-line wrapper around Python's 4Suite package
- XMLStarlet (see the sketch after this list)
- xpath - command-line wrapper around Perl's XPath library
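For the XPath route, XMLStarlet's sel command can print the name of each input file that matches an expression. A minimal sketch, assuming the documents are well-formed XML or XHTML without a default namespace (plain HTML usually has to be normalized into XML first, e.g. with hxnormalize -x or tidy):
# Print each file name once per match of the XPath from the question, then
# de-duplicate: -t starts a template, -m matches the XPath,
# -f prints the current input file name, -n adds a newline.
xmlstarlet sel -t -m '//div[@class="a"]//ul[@class="b"]' -f -n *.xml 2>/dev/null | sort -u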
Source: https://stackoverflow.com/questions/7334942/is-there-something-like-a-css-selector-or-xpath-grep