
Firefox Hacking *Updated*

hevnsnt

Lifer
Open Firefox 1.0 and type about:config in the address bar

1. Find browser.tabs.showSingleWindowModePrefs and double-click it so it = true
2. Find network.http.pipelining and double-click it so it = true
3. Find network.http.pipelining.maxrequests, double-click it, and change it from 4 to 100

1. This enables the advanced tab options in your Tools/Options page 🙂
2. This enables option #3.
3. This makes FF pipeline up to 100 requests per connection. Basically, if you thought FF was fast before, try it after this.
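If you'd rather not click through about:config, the same three steps can be written as a user.js file in your profile directory, which applies the prefs at every startup. This is a sketch using the pref names listed above (Firefox 1.0-era names):

```js
// user.js — applies the same three prefs from the steps above at startup
user_pref("browser.tabs.showSingleWindowModePrefs", true);
user_pref("network.http.pipelining", true);
user_pref("network.http.pipelining.maxrequests", 100);
```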

I-Hacked Mirror link for Firefox 1.0 Final Here


Other Contributions:
yankeesfan: to get older extensions to work in 1.0, edit extensions.lastAppVersion from '1.0' to '.10'. Bugmenot and others will now work.

[Updates]
Updated #3 from 8 to 100 per Mozilla developers' suggestions.
Added yankeesfan's older-extensions hack.

---

PSA - This is not the Software - Apps, Programming and Games forum.

AnandTech Moderator
 
Originally posted by: hevnsnt
Open Firefox 1.0 and type about:config in the address bar

1. Find browser.tabs.showSingleWindowModePrefs and double-click it so it = true
2. Find network.http.pipelining and double-click it so it = true
3. Find network.http.pipelining.maxrequests, double-click it, and change it from 4 to 8

1. This enables the advanced tab options in your Tools/Options page 🙂
2. This opens up option #3.
3. This makes FF use 8 threads for each page. Basically, if you thought FF was fast before, try it after this.

8 is for noobs. 20 is where it's at 😎
 
Originally posted by: WobbleWobble
What are the limitations of setting network.http.pipelining.maxrequests to a larger number? Higher CPU usage?

I think doing this violates HTTP standards (not that it really matters); you're only supposed to hold so many persistent connections to a server. The main consequence is that on a slow connection it can hurt your downloads: you're opening more connections than the link can really handle, i.e. you don't have the bandwidth to utilize 20 connections to the remote server, so the page may actually seem to load slower. On a fast connection, though, you end up using more of your available pipe when loading the page, i.e. if 4 pipelines only use 20% of your connection, then 20 pipelines will use 100% and the page loads faster.

The faster your connection, the higher you can set this number. You will have to do your own tests to see what to set it at.
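The trade-off described above can be sketched as a toy model: more pipelined requests per batch amortize round-trip latency, but total transfer time is still capped by bandwidth. All numbers below are illustrative assumptions, not measured Firefox behavior:

```python
# Toy model of the pipelining trade-off: requests go out in batches of at
# most `max_requests`, each batch costs one round trip, and the payload
# itself is limited by link bandwidth. Numbers are illustrative only.
import math

def page_load_estimate(num_resources, max_requests, rtt_s, total_bytes, bandwidth_bps):
    """Rough lower bound on load time over one persistent connection."""
    batches = math.ceil(num_resources / max_requests)
    latency_cost = batches * rtt_s                 # one RTT per batch
    transfer_cost = total_bytes * 8 / bandwidth_bps  # bits over the link
    return latency_cost + transfer_cost

# 40 resources, 80 ms RTT, 500 KB page, 5 Mbit/s link:
slow = page_load_estimate(40, 4, 0.08, 500_000, 5_000_000)   # 10 batches of 4
fast = page_load_estimate(40, 20, 0.08, 500_000, 5_000_000)  # 2 batches of 20
```

On this model the transfer cost (0.8 s here) is identical either way; only the latency term shrinks, which is why a fast link with high RTT benefits most and a saturated slow link barely changes.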
 
Originally posted by: Kyteland
Originally posted by: WobbleWobble
What are the limitations of setting network.http.pipelining.maxrequests to a larger number? Higher CPU usage?

I think doing this violates HTTP standards (not that it really matters); you're only supposed to hold so many persistent connections to a server. The main consequence is that on a slow connection it can hurt your downloads: you're opening more connections than the link can really handle, i.e. you don't have the bandwidth to utilize 20 connections to the remote server, so the page may actually seem to load slower. On a fast connection, though, you end up using more of your available pipe when loading the page, i.e. if 4 pipelines only use 20% of your connection, then 20 pipelines will use 100% and the page loads faster.

The faster your connection, the higher you can set this number. You will have to do your own tests to see what to set it at.

From http://forums.anandtech.com/me...8&threadid=1430877
It doesn't violate HTTP standards.

8 Connections

8.1 Persistent Connections

8.1.1 Purpose

Prior to persistent connections, a separate TCP connection was
established to fetch each URL, increasing the load on HTTP servers
and causing congestion on the Internet. The use of inline images and
other associated data often require a client to make multiple
requests of the same server in a short amount of time. Analysis of
these performance problems and results from a prototype
implementation are available [26] [30]. Implementation experience and
measurements of actual HTTP/1.1 (RFC 2068) implementations show good
results [39]. Alternatives have also been explored, for example,
T/TCP [27].

Persistent HTTP connections have a number of advantages:

- By opening and closing fewer TCP connections, CPU time is saved
in routers and hosts (clients, servers, proxies, gateways,
tunnels, or caches), and memory used for TCP protocol control
blocks can be saved in hosts.

- HTTP requests and responses can be pipelined on a connection.
Pipelining allows a client to make multiple requests without
waiting for each response, allowing a single TCP connection to
be used much more efficiently, with much lower elapsed time.

- Network congestion is reduced by reducing the number of packets
caused by TCP opens, and by allowing TCP sufficient time to
determine the congestion state of the network.

- Latency on subsequent requests is reduced since there is no time
spent in TCP's connection opening handshake.

- HTTP can evolve more gracefully, since errors can be reported
without the penalty of closing the TCP connection. Clients using
future versions of HTTP might optimistically try a new feature,
but if communicating with an older server, retry with old
semantics after an error is reported.

HTTP implementations SHOULD implement persistent connections.
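The pipelining the RFC describes amounts to writing several requests back-to-back on one persistent connection before reading any response. A minimal sketch of what the request bytes look like (the host and paths are placeholders; nothing is actually sent here):

```python
# Sketch of HTTP/1.1 pipelining as described in the RFC excerpt above:
# multiple GET requests are concatenated and would be written to ONE
# socket, with the responses then read back in the same order.
def build_pipelined_requests(host, paths):
    """Concatenate GET requests for `paths` into one byte string."""
    requests = []
    for path in paths:
        requests.append(
            f"GET {path} HTTP/1.1\r\n"
            f"Host: {host}\r\n"
            "Connection: keep-alive\r\n"
            "\r\n"
        )
    return "".join(requests).encode("ascii")

payload = build_pipelined_requests("example.com", ["/", "/style.css", "/logo.png"])
# A real client would push this whole payload with a single sendall() on
# one TCP connection instead of opening three separate connections.
```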
 