I have a Classic ASP script that first gets all images on a web page and then uploads them to an S3 bucket. The first part works fine, but when trying to upload the images to S3 I get the following error:

Property accepts only one-dimensional byte arrays. See code example below:

remoteurl = "https://some-website-with-images/"

Set AWS = Server.CreateObject("InAmazon.S3")
AWS.AccessKey = AWS_ACCESS_KEY
AWS.SecretKey = AWS_SECRET

AWS.Config("Url=http://s3-ap-southeast-2.amazonaws.com")
AWS.Bucket = "bucket-name"

Set http = Server.CreateObject ("MSXML2.XMLHTTP.6.0")
http.Open "GET", remoteurl, False
http.Send

Set re = New RegExp
re.Pattern = "<img [^>]*src=[""'][^ >]*(jpg|png)[""']"
re.IgnoreCase = True
re.Global = True
re.Multiline = True
Set oMatches = re.Execute(http.responseText)
If Not oMatches Is Nothing Then
    If oMatches.Count > 0 Then
    For Each oMatch In oMatches
        If Not oMatches(0).SubMatches Is Nothing Then
            sBodyText = oMatch.value
            sBodyText = replace(sBodyText,"src=""","")
            sBodyText = replace(sBodyText,"""","")

            ''Read in image as binary
            binaryImg = url_to_stream(sBodyText)
            AWS.objectDataB = binaryImg

            ''Upload to S3
            AWS.createObject(sBodyText)
        End If
    Next
    End If
End If

function url_to_stream(imageurl)

    set xml = Server.CreateObject("MSXML2.XMLHTTP.6.0")
    xml.Open "GET", imageurl, false
    xml.Send
    if err.number = 0 then
        if xml.readystate = 4 then
            if xml.status = 200 then
                set oStream = Server.CreateObject("Adodb.Stream")
                oStream.type = adTypeBinary
                oStream.Open()
                oStream.Write(xml.responseBody)
                url_to_stream = oStream.read
                oStream.close
                set oStream = nothing
            end if
         end if
    end if
   set xml = Nothing

end function

The error is triggered from the following line:

AWS.objectDataB = binaryImg

I use the AWS.objectDataB property without issue when uploading images from a form, but when I try to read in an image directly from a URL it doesn't work. Am I reading in the image incorrectly? How can I read in an image so the InAmazon.S3 object uploads it correctly?

Sunderam Dubey
Benzine
  • You cannot use `Response.BinaryWrite()` in that way as [the documentation](https://learn.microsoft.com/en-us/previous-versions/iis/6.0-sdk/ms524318%28v%3dvs.90%29) says "This method has no return values.". – user692942 Jun 13 '22 at 06:54
  • Ok thanks. I’ve also used adodb.stream object to read in the image and still get the same error message. Any suggestions? – Benzine Jun 13 '22 at 07:48
  • Suggest you [edit] the question to show the new approach you've taken to see if you've made any further assumptions there. Recommend looking through this question - [Classic ASP amazon s3 rest authorisation](https://stackoverflow.com/a/12520715) – user692942 Jun 13 '22 at 07:54
  • Believe you are using this COM library - [Amazon Integrator V6 ActiveX Edition - ObjectData](https://cdn.nsoftware.com/help/BA6/acx/S3_p_ObjectData.htm). – user692942 Jun 13 '22 at 08:14
  • 1
    I have edited my question to show my new approach and yes, u are correct, I am using the Amazon Integrator V6 ActiveX component to interact with s3. – Benzine Jun 13 '22 at 10:07
  • The issue is you write to the stream but don’t set the `Position` of the stream back to `0` before reading, so you end up reading nothing. – user692942 Jun 14 '22 at 06:16
  • 1
    I always wonder when I see these, are they migrating a server from GoDaddy to Amazon hosting, or are they "borrowing" content from a non-owned site to upload and claim as their own. – easleyfixed Jun 15 '22 at 20:27
  • Actually the opposite. Our site got hacked and they deleted all our website content (images and docs) from our S3 bucket. I'm using this script to get back our images from Google cache. – Benzine Jun 16 '22 at 23:19

2 Answers

I ended up resolving this by saving the image file locally first, then uploading it to S3. That seemed to do the trick.

function url_to_stream(imageurl)

  set xml = Server.CreateObject("MSXML2.XMLHTTP.6.0")
  xml.Open "GET", imageurl, false
  xml.Send
  if err.number = 0 then
    if xml.readystate = 4 then
      if xml.status = 200 then
        set oStream = Server.CreateObject("Adodb.Stream")
        oStream.type = adTypeBinary
        oStream.Open()
        oStream.Write(xml.responseBody)

        ''Get file name
        aryPath = split(imageurl, "/")

        varFileName = aryPath(UBound(aryPath))
        varFilePath = varAppPath & "uploads\aws\" & varFileName

        ''Save file locally
        oStream.SaveToFile varFilePath, adSaveCreateOverwrite

        ''Read saved file
        oStream.LoadFromFile varFilePath

        url_to_stream = oStream.Read
        oStream.Close
        set oStream = Nothing
      end if
    end if
  end if
  set xml = Nothing

end function
Benzine
  • This will work but is unnecessary, you need to adjust the stream `Position` before reading so you are back at the beginning of the stream. Would be one line `oStream.Position = 0`. – user692942 Jun 14 '22 at 06:18
Switching to ADODB.Stream is the correct approach. However, the stream is written to and then read immediately afterwards, at which point the stream position is at the end of the written data, so nothing is read.

Before reading, you need to set the stream Position back to 0:

oStream.Position = 0
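
For context, a minimal sketch of the question's url_to_stream function with that one-line fix applied (assuming the same MSXML2/ADODB objects as the question; adTypeBinary is declared locally in case adovbs.inc is not included):

```vbscript
function url_to_stream(imageurl)
    Const adTypeBinary = 1 ''ADO constant; define it if adovbs.inc is not included

    set xml = Server.CreateObject("MSXML2.XMLHTTP.6.0")
    xml.Open "GET", imageurl, false
    xml.Send

    if xml.status = 200 then
        set oStream = Server.CreateObject("Adodb.Stream")
        oStream.Type = adTypeBinary
        oStream.Open
        oStream.Write xml.responseBody

        ''Rewind to the start of the stream before reading,
        ''otherwise Read returns nothing
        oStream.Position = 0
        url_to_stream = oStream.Read

        oStream.Close
        set oStream = Nothing
    end if
    set xml = Nothing
end function
```

Read then returns the full one-dimensional byte array, which is what the ObjectData property expects, so there is no need for the intermediate SaveToFile/LoadFromFile round trip.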
user692942