CHANGELOG sitespeed.io
version 1.6
------------------------
* The SPOF rule now only reports font faces that are loaded from another top level domain. The actual font file is also reported.
* Show requests per domain on each individual page.
* Show the cache time for each asset & the delta between last modified & delivery time, plus the average.
* Configure which yslow file and which ruleset to use (a sketch of the underlying invocation follows this list).
* Show time spent in frontend vs backend per page.
* On the summary page, now show info blocks: time spent in frontend/backend and cache time/last modified average.
* New page: the assets page, showing the most used assets on the site
* Adjusted the warning rules on the summary page: the warning limit is now the average number collected from httparchive (where applicable)
* Java jar dependencies are now compiled for Java 1.6 and higher
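  The yslow file and ruleset options map to YSlow's phantomjs command line. A minimal sketch of such an invocation, with placeholder path and URL (the exact flags sitespeed.io passes may differ):
    # Run a specific yslow.js file with an explicit ruleset, output as xml
    phantomjs /path/to/yslow.js --info all --format xml --ruleset ydefault http://example.com/ > result.xml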
version 1.5.2
------------------------
* Bugfix: The SPOF rule wrongly reported CSS & JS as SPOFs
version 1.5.1
------------------------
* Bugfix: The crawler reported links returning 200 with a content type other than text/html as error URLs
version 1.5
------------------------
* Added support for configuring the crawler (see the dependencies/crawler.properties file).
* Added support for analyzing behind a proxy (thanks https://github.com/rhulse and https://github.com/samteeeee for reporting and testing it)
* Added an html page that shows the URLs that returned errors during the crawl
* Added percentages on the summary page
* Added support for setting the user agent
* Added support for setting the viewport
* Removed the experimental rule for the amount of JS used
* Added new rule: Critical Path
* Fine-tuned the SPOF rule: now also checks font faces
* Added time to first byte, measured using curl (curl is a new requirement); see the sketch after this list
* Fixed so the document weight is fetched via curl, i.e. the correct size when gzipped
* Bugfix: The check for phantomjs wasn't working; it works now
* Bugfix: JAVA_HOME is now used correctly (thanks Rob-m)
* Bugfix: Upgraded the crawler to 1.3, it now only fetches text/html links
* Removed csv as an output format
* New rule: Avoid CDN lookups when your page has few requests
* New rule: Do not load stylesheet files when the page has few requests
* New rule: Have a reasonable percentage of textual content compared to the rest of the page
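  The time to first byte and the gzipped document weight are fetched with curl. A minimal sketch of how such values can be collected (placeholder URL; sitespeed.io's exact curl options may differ):
    # Print time to first byte and the downloaded (gzip-encoded) document size
    curl -s -o /dev/null -H "Accept-Encoding: gzip" -w "TTFB: %{time_starttransfer}s size: %{size_download} bytes\n" http://example.com/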
version 1.4
------------------------
* Changed the limit value for document size on the summary page: it was 10 kb, but since gzip is taken into consideration it is now 100 kb!
* The css & js in the results are now concatenated into one file each
* Show, on the summary page, the average percentage of a page that consists of javascript
* Show median values where applicable on the summary page (now showing both average & median values)
* Show how much of a page is js & css, in percent, compared to the content
* Made the Java heap size & result directory configurable from the sitespeed script
* Cleanup of the sitespeed.io script: removed unused code and made it easier to update
* You can now zip the output result by calling the script
* Upgraded to the latest crawler & xml-velocity jars
* Added image, css, js and css image total weight/size on the page view
* Added a new experimental rule for javascript percentage
* Upgraded jquery from 1.8.2 to 1.8.3
version 1.3
------------------------
* Made all pages responsive; standard stuff, except the table in pages.vm, which is split into two different tables: one for phones and one for the rest
* Moved the webpagetest link from pages.vm to the page view, to make more space
* Added the summary data on page.vm as well
* Added the possibility to output all page data as a csv file (yes, that old thing)
* Added the possibility to output every html file as a png, for easy inclusion in documents/web etc
* Added htmlcompressor (http://code.google.com/p/htmlcompressor/) to compress all html, to make the pages smaller
* Upgraded to latest version of the crawler (smaller in size)
* Added support for testing only one page (depth = 0), thanks @tomsutton1984
version 1.2
------------------------
* Better handling of input parameters; you can now specify them in any order you like
* Possibility to skip crawling specific path segments in URLs
* Run multiple processes when analyzing pages (to make it faster)
* More documentation in the sitespeed.io script
* Include the rules dictionary when using yslow; always update doc.js in yslow when adding a new rule
* Show the full rule name when showing broken rules
* Show explanation for all rules used, linked from summary page
* Show the exact rule number, for easier tracking
version 1.1
------------------------
* New crawler instead of wget, which didn't work on some sites with the spider options (amazon etc)
* Fix for the css-in-head rule: now only dns lookups are punished, not the number of css files
* Crawl by following a specific path, meaning you can analyze parts of a site
version 1.0.1
------------------------
* Fixed a bug where URLs were sometimes fetched from domains other than the main domain
* Added links to the tested start URL on both the summary page and the page view
* Added parameters so that the webpagetest run is done three times by default
* When a SPOF is found, a link to webpagetest with the SPOF domains activated is used by default
version 1.0
------------------------
* Show full URLs in the pages & page views, to make it easier to understand which URL is analyzed
* Show extra data in modals to make it clearer
* Popovers & better texts on the summary page
* Cleanup & bug fixes in the bash script; it sometimes failed on sites where yslow outputted content after the xml
* Added output pngs that can be used in documents
version 0.9
------------------------
* New rules: loading js async and finding single points of failure
* Modified the expires rule to skip analytics scripts
* Updated rules texts
version 0.8
------------------------
* Added new custom rules and modified existing yslow rules.
* Favicon added :)
version 0.7
------------------------
* Upgraded to jquery 1.8
* Upgraded Twitter Bootstrap to 2.1
* Better title tag on result pages
* Fixed so that long URLs don't break the layout
* Fixed that the output xml was sometimes broken
* Only fetch content of type html