## mastofollow

### why

If you run your own Mastodon instance, you may also be frustrated by the
problem where you gain a new follower, click through to view their profile,
and Mastodon just shows you this:

I tell myself I'll do it later, and then I forget, and now I have a growing
list of followers that I haven't followed back because I can't quickly see
their list of recent posts to determine if they're a bot or a weirdo or someone
that only reposts political things.

### what

This is a hacky Ruby script that will:

1. Fetch your list of followers (paginating as necessary)
2. Fetch the RSS feed of each follower and gather their recent statuses
3. Sort all statuses in reverse chronological order
4. Dump out static HTML files of each page of statuses (100 at a time)
5. Spin up a WEBrick server to provide a quick interface to view the static
   files

Once the HTML files are created, you can view them locally and see each status
with its image attachments and user avatars as a single feed.
From there, hopefully you can find some good content and follow some users
back.
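
Steps 3 and 4 above are simple enough to sketch on their own: sort by date
newest-first and slice into pages, mirroring the script's `PER_PAGE` constant
and `statuses.html` / `statuses2.html` file naming. The statuses below are
hypothetical stand-ins, not real fetched data.

```ruby
# a sketch of steps 3-4: sort statuses newest-first and slice them into
# pages of 100; these status hashes are made-up stand-ins
PER_PAGE = 100

statuses = 250.times.map{|n| { "date" => n, "text" => "status #{n}" } }

# newest statuses come first, then groups of PER_PAGE become pages
pages = statuses.sort_by{|s| s["date"] }.reverse.each_slice(PER_PAGE).to_a

pages.each_with_index do |pg, i|
  # the first page is statuses.html, later pages are statuses2.html, etc.
  filename = "statuses#{i == 0 ? "" : i + 1}.html"
  # the real script renders each page through ERB and writes it out here
  puts "#{filename}: #{pg.count} statuses"
end
```

With 250 statuses this yields three pages of 100, 100, and 50.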

### how

    $ git clone https://github.com/jcs/mastofollow
    $ cd mastofollow
    mastofollow$ bundle install
    [...]
    mastofollow$ bundle exec ruby mastofollow.rb https://example.com/@you
    fetching followers page 1...
    fetching followers page 2...
    [...]
    fetching https://.../users/steve.rss [1/...]
    fetching https://.../users/jakob.rss [2/...]

Where the `https://example.com/@you` argument is your canonical Mastodon URL.

After fetching everything, navigate to `http://127.0.0.1:8000/statuses.html` to
view the timeline.
It will look rather basic, like this:

### but

This program naively assumes that most followers will be using Mastodon,
which provides an RSS feed at `https://example.com/user.rss`.
It does not do proper WebFinger lookups or ActivityPub parsing.
If particular followers are not using Mastodon, or their server does not
provide a `.rss` response, they will be skipped.
If their RSS feed does not provide `pubDate` dates for statuses, they will
also be skipped.

The internal state of statuses is written out to `statuses.json` for further
inquiry, but everything is done in memory and each run starts over.
A SQLite backend or something could be added to reduce memory use and browse
statuses in something other than static HTML, but this worked well enough for
me.
Don't run it too often.
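
Since `statuses.json` keeps the same hash layout as the in-memory statuses
(`user` hash, status `url`, `date` as a Unix timestamp, `text`,
`attachments`), further inquiry only needs the stdlib JSON module. The record
below is a made-up stand-in for a real dump:

```ruby
require "json"

# a made-up record in the same shape the script writes to statuses.json;
# normally you would use JSON.parse(File.read("statuses.json"))
statuses = JSON.parse(<<~JSON)
  [
    { "user": { "url": "https://example.com/users/steve", "name": "steve" },
      "url": "https://example.com/@steve/123",
      "date": 1700000000,
      "text": "<p>hello</p>",
      "attachments": [] }
  ]
JSON

# e.g. count statuses per follower to spot the noisiest accounts
counts = statuses.group_by{|s| s["user"]["url"] }.transform_values(&:count)
```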
mastofollow.rb
#!/usr/bin/env ruby
#
# Copyright (c) 2024 joshua stein <jcs@jcs.org>
#
# Permission to use, copy, modify, and distribute this software for any
# purpose with or without fee is hereby granted, provided that the above
# copyright notice and this permission notice appear in all copies.
#
# THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
# WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
# MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
# ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
# WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
# ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF
# OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
#

require "json"
require "nokogiri"
require "date"
require "erb"
require "sanitize"
require "webrick"

require "./sponge"

PER_PAGE = 100

ME = ARGV[0]
if !ME.to_s.match(/^https?:\/\/[^\/]+\/@.+/)
  puts "usage: #{$0} https://example.com/@you"
  exit 1
end

def h(str)
  CGI.escapeHTML(str.to_s)
end

def sanitize(html)
  Sanitize.fragment(html, Sanitize::Config::RELAXED)
end

s = Sponge.new
s.timeout = 15

followers = []
page = 1
while true do
  puts "fetching followers page #{page}..."
  js = s.fetch("#{ME}/followers.json?page=#{page}").json
  followers += js["orderedItems"]

  if js["next"]
    page += 1
  else
    break
  end
end

statuses = []

followers.shuffle.each_with_index do |f,x|
  url = "#{f}.rss"
  print "fetching #{url} [#{x + 1}/#{followers.count}]"

  begin
    res = s.fetch(url, :get, nil, nil, { "Accept" => "application/rss+xml" })

    if res.ok?
      puts ""
    else
      puts " (failed #{res.status})"
      next
    end
  rescue Timeout::Error
    puts " (timed out)"
    next
  rescue => e
    puts " (#{e.message})"
    next
  end

  doc = Nokogiri::XML(res.body)

  user = {
    "url" => f,
  }

  if n = doc.xpath("//title")[0]
    user["name"] = n.text
  end

  if a = doc.xpath("//channel/image/url")[0]
    user["avatar"] = a.text
  end

  doc.xpath("//item").each do |i|
    u = i.xpath("link").text
    text = i.xpath("description").text

    if !i.xpath("pubDate").any?
      puts " no pubDate for status #{u}"
      next
    end

    # parse the date once and reuse it below
    date = DateTime.parse(i.xpath("pubDate").text).to_time

    status = {
      "user" => user,
      "url" => u,
      "date" => date.to_i,
      "text" => text,
      "attachments" => [],
    }

    begin
      i.xpath("media:content").each do |att|
        status["attachments"].push({
          "url" => att["url"],
          "medium" => att["medium"],
        })
      end
    rescue Nokogiri::XML::XPath::SyntaxError
      # feed did not declare the media namespace
    end

    statuses.push status
  end
end

File.write("statuses.json", statuses.to_json)

f = nil
page = 1
statuses.sort_by{|s| s["date"] }.reverse.each_with_index do |s,x|
  if f == nil
    f = File.open("statuses#{page == 1 ? "" : page}.html", "w+")
    f.puts <<-END
      <!doctype html>
      <html>
      <head>
      <meta http-equiv="content-type" content="text/html; charset=utf-8" />
      <meta name="referrer" content="never" />
      <link rel="stylesheet" type="text/css" href="style.css" />
      </head>
      <body>
    END
  end

  t = ERB.new <<-END
    <div class="status">
      <div class="date">
        <a href="<%= h(s["url"]) %>" target="_blank">
          <%= Time.at(s["date"]).strftime("%Y-%m-%d %H:%M:%S") %>
        </a>
      </div>
      <div class="avatar">
        <% if s["user"]["avatar"] %>
          <img src="<%= h(s["user"]["avatar"]) %>">
        <% end %>
      </div>
      <div class="title">
        <a href="<%= h(s["user"]["url"]) %>" target="_blank">
          <%= h(s["user"]["name"]) %>
        </a>
      </div>
      <div class="user">
        <a href="<%= h(s["user"]["url"]) %>" target="_blank">
          <%= h(s["user"]["url"]) %>
        </a>
      </div>
      <div class="body">
        <%= sanitize(s["text"]) %>
      </div>
      <% s["attachments"].each do |at| %>
        <div class="attachment">
          <% if at["medium"] == "video" %>
            <a href="<%= h(at["url"]) %>">Video: <%= h(at["url"]) %></a>
          <% else %>
            <img src="<%= h(at["url"]) %>">
          <% end %>
        </div>
      <% end %>
    </div>
  END
  f.write t.result(binding)

  if ((x + 1) % PER_PAGE == 0) || (x == statuses.count - 1)
    t = ERB.new <<-END
      <div class="pages">
        <% (statuses.count / PER_PAGE.to_f).ceil.times do |pp| %>
          <a href="statuses<%= pp == 0 ? "" : pp + 1 %>.html" class="page">
            <%= pp + 1 %>
          </a>
        <% end %>
      </div>
      </body>
      </html>
    END
    f.write t.result(binding)
    f.close

    f = nil
    page += 1
  end
end

puts "", "open the following URL to view statuses:", ""
puts "  http://127.0.0.1:8000/statuses.html", ""

server = WEBrick::HTTPServer.new(:Port => 8000, :DocumentRoot => Dir.pwd)
trap("INT") do
  server.shutdown
end
server.start
sponge.rb
#!/usr/bin/env ruby
#
# Copyright (c) 2012-2024 joshua stein <jcs@jcs.org>
#
# Permission to use, copy, modify, and distribute this software for any
# purpose with or without fee is hereby granted, provided that the above
# copyright notice and this permission notice appear in all copies.
#
# THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
# WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
# MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
# ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
# WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
# ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF
# OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
#

require "cgi"
require "uri"
require "net/https"
require "socket"
require "ipaddr"
require "securerandom"
require "stringio"

require "active_support/hash_with_indifferent_access"

class CaseInsensitiveHash < HashWithIndifferentAccess
  def [](key)
    super convert_key(key)
  end

protected
  def convert_key(key)
    key.respond_to?(:downcase) ? key.downcase : key
  end
end

module Net
  class HTTP
    attr_accessor :address, :custom_conn_address, :skip_close

    def start # :yield: http
      if block_given? && !skip_close
        begin
          do_start
          return yield(self)
        ensure
          do_finish
        end
      end
      do_start
      self
    end

    private
    def conn_address
      if self.custom_conn_address.to_s != ""
        self.custom_conn_address
      else
        address
      end
    end
  end
end

class SpongeResponse
  attr_reader :from_uri

  def initialize(net_http_res, from_uri = nil)
    @res = net_http_res
    @from_uri = from_uri
  end

  def inspect
    "<#{self.class} from #{self.from_uri.to_s}: status=#{self.status} " <<
      "body=#{self.body ? self.body.to_s[0, 100] : nil}>"
  end

  def body
    @res.body
  end

  def status
    @res.code.to_i
  end

  def headers
    return @headers if @headers

    @headers = CaseInsensitiveHash.new(@res.to_hash)
    @headers.each do |k,v|
      @headers[k] = v[0]
    end
    @headers
  end

  def json
    @json ||= JSON.parse(@res.body)
  end

  def ok?
    (200 .. 299).include?(status)
  end

  def to_s
    @res.body
  end
end

class Sponge
  MAX_TIME = 60
  MAX_DNS_TIME = 10
  MAX_KEEP_ALIVE_TIME = 30

  @@KEEP_ALIVES = {}

  attr_accessor :debug, :follow_redirection, :use_custom_resolver,
    :keep_alive, :timeout, :use_private_keepalives, :resolve_cache,
    :avoid_badnets, :local_ip, :user_agent

  # rfc3330
  BAD_NETS = [
    "0.0.0.0/8",
    "10.0.0.0/8",
    "127.0.0.0/8",
    "169.254.0.0/16",
    "172.16.0.0/12",
    "192.0.2.0/24",
    "192.88.99.0/24",
    "192.168.0.0/16",
    "198.18.0.0/15",
    "224.0.0.0/4",
    "240.0.0.0/4"
  ]

  # old api
  def self.fetch(uri, headers = {}, limit = 10)
    s = Sponge.new
    s.fetch(uri, "get", nil, nil, headers, {}, limit)
  end

  def initialize
    @cookies = {}
    @follow_redirection = true
    @use_custom_resolver = true
    @keep_alive = false
    @timeout = MAX_TIME
    @use_private_keepalives = false
    @resolve_cache = {}
    @local_ip = nil
    @json = nil
    @user_agent = "sponge/1.0"

    @avoid_badnets = true
    begin
      if defined?(Rails) && Rails.env.development?
        @avoid_badnets = false
      end
    rescue
    end

    @KEEP_ALIVES = {}
  end

  def close_stale_keep_alives
    [ @KEEP_ALIVES, @@KEEP_ALIVES ].each do |ka|
      ka.keys.each do |h|
        if Time.now - ka[h][:last] > MAX_KEEP_ALIVE_TIME
          begin
            ka[h][:obj].finish
          rescue IOError
          end
          ka.delete(h)
        end
      end
    end
  end

  def find_keep_alive_for(host)
    where = @@KEEP_ALIVES
    if self.use_private_keepalives
      where = @KEEP_ALIVES
    end

    if !where[host]
      return nil
    end

    return where[host][:obj]
  end

  def save_keep_alive(host, obj)
    where = @@KEEP_ALIVES
    if self.use_private_keepalives
      where = @KEEP_ALIVES
    end

    if obj == nil
      if where[host]
        begin
          where[host][:obj].finish
        rescue IOError
        end
        where.delete(host)
      end
    else
      where[host] = { :last => Time.now, :obj => obj }
    end
  end

  def set_cookie(from_host, cookie_line)
    cookie = { "domain" => from_host }

    cookie_line.split(/; ?/).each do |chunk|
      pieces = chunk.split("=")

      if pieces[0].match(/^(path|domain|httponly)$/i)
        cookie[pieces[0]] = pieces[1]
      else
        cookie["name"] = pieces[0]
        cookie["value"] = pieces[1]
      end
    end

    dputs "setting cookie #{cookie["name"]} on domain #{cookie["domain"]} " +
      "to #{cookie["value"].inspect}"

    if !@cookies[cookie["domain"]]
      @cookies[cookie["domain"]] = {}
    end

    if cookie["value"].to_s == ""
      # an empty value means the server wants the cookie deleted
      @cookies[cookie["domain"]].delete(cookie["name"])
    else
      @cookies[cookie["domain"]][cookie["name"]] = cookie["value"]
    end
  end

  def cookies(host)
    cooks = @cookies[host] || {}

    # check for domain cookies
    @cookies.keys.each do |dom|
      if dom.length < host.length &&
      dom == host[host.length - dom.length .. host.length - 1]
        dputs "adding domain keys from #{dom}"
        cooks = cooks.merge @cookies[dom]
      end
    end

    if cooks
      return cooks.map{|k,v| "#{k}=#{v};" }.join(" ")
    else
      return ""
    end
  end

  def fetch(uri, method = :get, fields = nil, raw_post_data = nil,
  headers = {}, attachments = {}, limit = 10)
    if limit <= 0
      raise ArgumentError, "HTTP redirection too deep"
    end

    if !uri.is_a?(URI)
      uri = URI.parse(uri)
    end
    host = nil
    ip = nil
    method = method.to_s.downcase.to_sym
    @json = nil

    if self.keep_alive && (host = self.find_keep_alive_for(uri.host))
      dputs "using cached keep-alive connection to #{uri.host}"
    else
      if @use_custom_resolver
        # we'll manually resolve the ip so we can verify it's not local
        tip = nil
        ips = @resolve_cache[uri.host]
        if !ips || !ips.any?
          begin
            Timeout.timeout(MAX_DNS_TIME) do
              ips = [ Addrinfo.ip(uri.host).ip_address ]

              if !ips.any?
                raise
              end

              @resolve_cache[uri.host] = ips
            end
          rescue Timeout::Error
            raise "couldn't resolve #{uri.host} (DNS timeout)"
          rescue SocketError, StandardError => e
            raise "couldn't resolve #{uri.host} (#{e.inspect}) " <<
              "(#{ips.inspect}) (#{tip.inspect})"
          end
        end

        # pick a random one
        tip = ips[rand(ips.length)]
        ip = IPAddr.new(tip)

        if !ip
          raise "couldn't resolve #{uri.host}"
        end

        if @avoid_badnets &&
        BAD_NETS.select{|n| IPAddr.new(n).include?(ip) }.any?
          raise "refusing to talk to IP #{ip.to_s}"
        end

        host = Net::HTTP.new(ip.to_s, uri.port)

        if uri.scheme == "https"
          # openssl needs to know the hostname, so we'll override conn_address
          # to connect to our ip
          host.address = uri.host
          host.custom_conn_address = ip.to_s
        end
      else
        host = Net::HTTP.new(uri.host, uri.port)
      end

      if host.respond_to?(:local_host) && self.local_ip
        host.local_host = self.local_ip
      end

      if self.debug
        host.set_debug_output STDOUT
      end

      if uri.scheme == "https"
        host.use_ssl = true
        host.verify_mode = OpenSSL::SSL::VERIFY_NONE
      end
    end

    # convert post params into query params for get requests
    if method == :get
      if raw_post_data
        uri.query = URI.encode(raw_post_data)
        if !headers["Content-Type"]
          headers["Content-Type"] = "application/x-www-form-urlencoded"
        end
      elsif fields && fields.any?
        uri.query = encode_fields(fields)
      end
    end

    if method != :get
      if raw_post_data && attachments.any?
        raise "can't do raw POST data and attachments"
      end

      if attachments.any?
        boundary = "----------#{SecureRandom.hex}"

        headers["Content-Type"] = "multipart/form-data; boundary=#{boundary}"

        post_data = fields.map{|k,v|
          "--#{boundary}\r\n" +
          "Content-Disposition: form-data; name=\"#{k}\"\r\n" +
          "\r\n" +
          v.to_s +
          "\r\n"
        }.join

        post_data = post_data.force_encoding("binary")

        attachments.each do |k,v|
          if !v.is_a?(Hash)
            raise "attachment #{k} is not a hash"
          elsif !v.include?(:data)
            raise "attachment #{k} has no :data"
          end

          post_data << ("--#{boundary}\r\n" <<
            "Content-Disposition: form-data; name=\"#{k}\"; filename=\"" <<
            "#{v[:filename]}\"\r\n" <<
            "Content-Type: #{v[:content_type]}\r\n" <<
            "\r\n").force_encoding("binary")

          post_data << v[:data].force_encoding("binary")
          post_data << "\r\n".force_encoding("binary")
        end

        post_data << ("--#{boundary}--\r\n").force_encoding("binary")

        post_data = post_data.force_encoding("binary")
      elsif raw_post_data
        post_data = raw_post_data
        if !headers["Content-Type"]
          headers["Content-Type"] = "application/x-www-form-urlencoded"
        end
      elsif fields && fields.any?
        post_data = encode_fields(fields)
      else
        post_data = ""
      end

      headers["Content-Length"] = post_data.bytesize.to_s
    end

    if uri.path.to_s == ""
      uri.path = "/"
    end

    uri.path = uri.path.gsub(/^\/\/+/, "/")

    cooks = cookies(uri.host).to_s

    dputs "fetching #{uri} (#{ip.to_s}) " + (uri.user ? "with http auth " +
      uri.user + "/" + ("*" * uri.password.length) + " " : "") +
      "by #{method} with cookies #{cooks}" +
      (attachments.any? ? " with #{attachments.length} attachment(s)" : "")

    hs = {
      "Host" => uri.host,
      "User-Agent" => self.user_agent,
    }

    if cooks != ""
      hs["Cookie"] = cooks
    end

    headers = hs.merge(headers || {})

    if self.keep_alive
      headers["Connection"] = "keep-alive"
      host.skip_close = true
    end

    if uri.user
      headers["Authorization"] = "Basic " +
        ["#{uri.user}:#{uri.password}"].pack("m").delete("\r\n")
    end

    res = nil
    begin
      path = uri.path
      if uri.query.to_s != ""
        path += "?" + uri.query
      end

      Timeout.timeout(@timeout) do
        req = case method
        when :delete
          Net::HTTP::Delete.new(path, headers)
        when :get
          Net::HTTP::Get.new(path, headers)
        when :head
          Net::HTTP::Head.new(path, headers)
        when :options
          Net::HTTP::Options.new(path, headers)
        when :post
          Net::HTTP::Post.new(path, headers)
        when :put
          Net::HTTP::Put.new(path, headers)
        else
          raise "unsupported method #{method}"
        end

        if post_data
          req.body = post_data
        end

        res = host.request(req)
      end
    rescue EOFError, Errno::EBADF => e
      if self.keep_alive && self.find_keep_alive_for(uri.host)
        # tried to re-use a dead connection, retry again from the start
        self.save_keep_alive(uri.host, nil)
        dputs "got eof using dead keep-alive socket, retrying"
        return fetch(uri, method, fields, raw_post_data, headers, attachments,
          limit - 1)
      else
        raise e
      end
    end

    if res.get_fields("Set-Cookie")
      res.get_fields("Set-Cookie").each do |cook|
        set_cookie(uri.host, cook)
      end
    end

    if self.keep_alive
      self.save_keep_alive(uri.host, host)
    end

    self.close_stale_keep_alives

    case res
    when Net::HTTPRedirection
      if @follow_redirection
        # follow
        newuri = URI.parse(res["location"])
        if newuri.host
          dputs "following redirection to " + res["location"]
        else
          # relative path
          newuri.host = uri.host
          newuri.scheme = uri.scheme
          newuri.port = uri.port
          newuri.path = "/#{newuri.path}"

          dputs "following relative redirection to " + newuri.to_s
        end

        fetch(newuri.to_s, "get", nil, nil, {}, {}, limit - 1)
      else
        dputs "not following redirection (disabled)"
        return SpongeResponse.new(res, uri)
      end
    else
      return SpongeResponse.new(res, uri)
    end
  end

  def get(uri, params = {}, headers = {})
    fetch(uri, :get, params, nil, headers)
  end

  def post(uri, fields, headers = {})
    fetch(uri, :post, fields, nil, headers)
  end

private
  def dputs(string)
    if self.debug
      puts string
    end
  end

  def encode_fields(fields)
    e = []
    fields.each do |k,v|
      if v.is_a?(Hash)
        # :user => { :name => "hi", :age => "1" }
        # becomes
        # user[name]=hi and user[age]=1
        v.each do |vk,vv|
          e.push "#{CGI.escape("#{k}[#{vk}]")}=#{CGI.escape(vv.to_s)}"
        end
      elsif v.is_a?(Array)
        # :user => [ "one", "two" ]
        # becomes
        # user[]=one and user[]=two
        v.each do |vv|
          e.push "#{CGI.escape("#{k}[]")}=#{CGI.escape(vv.to_s)}"
        end
      else
        e.push "#{CGI.escape(k.to_s)}=#{CGI.escape(v.to_s)}"
      end
    end

    e.join("&")
  end
end