
I'm having a hell of a problem here. I'm using Ruby on Rails: ruby 1.8.7 (2011-12-10 patchlevel 356), rails 2.3.14.

I'm trying a simple open with open-uri on the following address:

http://jollymag.net/n/10390-летни-секс-пози-във-водата.html (link is NSFW)

However, the resulting file, when read, produces a weird (broken) string. This was tested on ruby 1.9.3 and rails 3.2.x too.

require 'open-uri'
require 'nokogiri'

url = 'http://jollymag.net/n/10390-летни-секс-пози-във-водата.html'
url = URI.encode(url)
file = open(url)
doc = file.collect.to_s # <- the document is broken
document = Nokogiri::HTML.parse(doc, nil, 'utf8')
puts document # <- the document after Nokogiri has one line of content

I tried Iconv and other approaches, but nothing works. The above code is more or less a minimal, isolated case of the exact problem.
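For example, one of the things I tried was forcing a conversion with Iconv, roughly like this (windows-1251 is just a guess at the source charset):

require 'iconv'

doc = Iconv.iconv('UTF-8', 'windows-1251', file.read).to_s # <- still broken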

I'd appreciate any help, since I've been trying to figure out this bug for a couple of days now.

Regards, Yavor


1 Answer


So the problem was a tricky one for me. It appears that some servers return only a gzip-ed response, so in order to read it you of course have to decompress it first. I decided to post my whole crawl code so people can find a more complete solution to such problems. It is part of a bigger class, so it refers to self a lot of the time.
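The core of the fix in isolation: check the content_encoding of the object open-uri returns, and run the body through Zlib::GzipReader when it is gzip-ed. A minimal sketch for the URL from the question:

  require 'open-uri'
  require 'zlib'

  url = URI.encode('http://jollymag.net/n/10390-летни-секс-пози-във-водата.html')
  stream = open(url, 'Accept-Encoding' => 'gzip,deflate')
  body = if stream.content_encoding.include?('gzip')
           Zlib::GzipReader.new(stream).read
         else
           stream.read
         end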

Hope it helps!

  require 'open-uri'
  require 'timeout'
  require 'zlib'
  require 'iconv'
  require 'nokogiri'

  SHINSO_HEADERS = {
    'Accept'          => '*/*',
    'Accept-Charset'  => 'utf-8, windows-1251;q=0.7, *;q=0.6',
    'Accept-Encoding' => 'gzip,deflate',
    'Accept-Language' => 'bg-BG, bg;q=0.8, en;q=0.7, *;q=0.6',
    'Connection'      => 'keep-alive',
    'From'            => 'support@xenium.bg',
    'Referer'         => 'http://svejo.net/',
    'User-Agent'      => 'Mozilla/5.0 (compatible; Shinso/1.0)'
  }

  def crawl(url_address)
    self.errors = Array.new
    begin
      # Parse the URL; if it contains non-ASCII characters (like the Cyrillic
      # ones in the question), decode and re-encode it before parsing again.
      begin
        url_address = URI.parse(url_address)
      rescue URI::InvalidURIError
        url_address = URI.decode(url_address)
        url_address = URI.encode(url_address)
        url_address = URI.parse(url_address)
      end
      url_address.normalize!
      # Fetch the page with a 10-second timeout; open-uri extends URI::HTTP
      # with an open method that accepts a header hash.
      stream = ""
      timeout(10) { stream = url_address.open(SHINSO_HEADERS) }
      if stream.size > 0
        url_crawled = URI.parse(stream.base_uri.to_s)
      else
        self.errors << "Server said status 200 OK but document file is zero bytes."
        return
      end
    rescue Exception => exception
      self.errors << exception
      return
    end
    # extract information before html parsing
    self.url_posted       = url_address.to_s
    self.url_parsed       = url_crawled.to_s
    self.url_host         = url_crawled.host
    self.status           = stream.status
    self.content_type     = stream.content_type
    self.content_encoding = stream.content_encoding
    self.charset          = stream.charset
    # Decompress the body according to its Content-Encoding;
    # deflate bodies are decoded with Zlib::Inflate.
    if    stream.content_encoding.include?('gzip')
      document = Zlib::GzipReader.new(stream).read
    elsif stream.content_encoding.include?('deflate')
      document = Zlib::Inflate.inflate(stream.read)
    #elsif stream.content_encoding.include?('x-gzip') or
    #elsif stream.content_encoding.include?('compress')
    else
      document = stream.read
    end
    self.charset_guess    = CharGuess.guess(document)
    # Convert to UTF-8 only when a charset was detected and it is not
    # already UTF-8.
    if not self.charset_guess.blank? and
       not ['utf-8', 'utf8'].include?(self.charset_guess.downcase)
      document = Iconv.iconv('UTF-8', self.charset_guess, document).to_s
    end
    document = Nokogiri::HTML.parse(document, nil, 'UTF-8')
    # Nokogiri lowercases HTML element names, so '//script' also matches <SCRIPT> tags.
    document.xpath('//script').remove
    # Rewrite every src attribute to an absolute URL; make_absolute_address
    # is another method of the same class.
    for item in document.xpath('//*[translate(@src, "ABCDEFGHIJKLMNOPQRSTUVWXYZ", "abcdefghijklmnopqrstuvwxyz")]')
      item.set_attribute('src', make_absolute_address(item['src']))
    end
    # Strip HTML comments from the serialized document.
    document = document.to_s.gsub(/<!--(.|\s)*?-->/,'')
    self.content = Nokogiri::HTML.parse(document, nil, 'UTF-8')
  end
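For completeness, calling the method would look roughly like this (Page is a hypothetical stand-in for the bigger class the method belongs to, which isn't shown here):

  page = Page.new  # hypothetical host class for crawl and its attributes
  page.crawl('http://jollymag.net/n/10390-летни-секс-пози-във-водата.html')
  puts page.errors.inspect unless page.errors.empty?
  puts page.charset
  puts page.content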