I have a Classic ASP script that first gets all images on a web page and then uploads them to an S3 bucket. The first part works fine, but when trying to upload the images to S3 I get the following error:
Property accepts only one-dimensional byte arrays.
See code example below:
remoteurl = "https://some-website-with-images/"
Set AWS = Server.CreateObject("InAmazon.S3")
AWS.AccessKey = AWS_ACCESS_KEY
AWS.SecretKey = AWS_SECRET
AWS.Config("Url=http://s3-ap-southeast-2.amazonaws.com")
AWS.Bucket = "bucket-name"
Set http = Server.CreateObject ("MSXML2.XMLHTTP.6.0")
http.Open "GET", remoteurl, False
http.Send
Set re = New RegExp
re.Pattern = "<img[^>]*src=[""'][^ >]*(jpg|png)[""']"
re.IgnoreCase = True
re.Global = True
re.Multiline = True
Set oMatches = re.Execute(http.responseText)
If Not oMatches Is Nothing Then
    If oMatches.Count > 0 Then
        For Each oMatch In oMatches
            If Not oMatch.SubMatches Is Nothing Then
                sBodyText = oMatch.Value
                sBodyText = Replace(sBodyText, "src=""", "")
                sBodyText = Replace(sBodyText, """", "")
                ' Read in image as binary
                binaryImg = url_to_stream(sBodyText)
                AWS.objectDataB = binaryImg
                ' Upload to S3
                AWS.createObject(sBodyText)
            End If
        Next
    End If
End If
Function url_to_stream(imageurl)
    Set xml = Server.CreateObject("MSXML2.XMLHTTP.6.0")
    xml.Open "GET", imageurl, False
    xml.Send
    If Err.Number = 0 Then
        If xml.readyState = 4 Then
            If xml.Status = 200 Then
                Set oStream = Server.CreateObject("ADODB.Stream")
                oStream.Type = adTypeBinary
                oStream.Open
                oStream.Write xml.responseBody
                url_to_stream = oStream.Read
                oStream.Close
                Set oStream = Nothing
            End If
        End If
    End If
    Set xml = Nothing
End Function
The error is triggered from the following line:
AWS.objectDataB = binaryImg
I use the AWS.objectDataB property when uploading images from a form without issue, but when I try to read in an image directly from a URL it doesn't work. Am I reading in the image incorrectly? How can I read in an image so the InAmazon.S3 object uploads it correctly?
Cheers
CodePudding user response:
I ended up resolving this by saving the image file locally first, then uploading it to S3. Seemed to do the trick.
Function url_to_stream(imageurl)
    Set xml = Server.CreateObject("MSXML2.XMLHTTP.6.0")
    xml.Open "GET", imageurl, False
    xml.Send
    If Err.Number = 0 Then
        If xml.readyState = 4 Then
            If xml.Status = 200 Then
                Set oStream = Server.CreateObject("ADODB.Stream")
                oStream.Type = adTypeBinary
                oStream.Open
                oStream.Write xml.responseBody
                ' Get file name
                aryPath = Split(imageurl, "/")
                varFileName = aryPath(UBound(aryPath))
                varFilePath = varAppPath & "uploads\aws\" & varFileName
                ' Save file locally
                oStream.SaveToFile varFilePath, adSaveCreateOverwrite
                ' Read saved file
                oStream.LoadFromFile varFilePath
                url_to_stream = oStream.Read
                oStream.Close
                Set oStream = Nothing
            End If
        End If
    End If
    Set xml = Nothing
End Function
CodePudding user response:
Switching to using ADODB.Stream is the correct approach. However, the problem is that the stream is written to and then read immediately afterwards, at which point the stream's position sits at the end of the data just written, so Read returns nothing. Before reading, you need to set the stream's Position back to 0:
oStream.Position = 0
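Applied to the original url_to_stream function, this is roughly how the fix looks (a sketch; it assumes adTypeBinary is defined locally rather than via an include, and it drops the outer Err.Number/readyState checks for brevity):

```vbscript
Function url_to_stream(imageurl)
    Const adTypeBinary = 1
    Set xml = Server.CreateObject("MSXML2.XMLHTTP.6.0")
    xml.Open "GET", imageurl, False
    xml.Send
    If xml.Status = 200 Then
        Set oStream = Server.CreateObject("ADODB.Stream")
        oStream.Type = adTypeBinary
        oStream.Open
        oStream.Write xml.responseBody
        ' Rewind before reading: after Write, the position is at the
        ' end of the stream, so Read would otherwise return Empty
        oStream.Position = 0
        url_to_stream = oStream.Read
        oStream.Close
        Set oStream = Nothing
    End If
    Set xml = Nothing
End Function
```

This avoids the round trip through the filesystem that the save-to-disk workaround relies on; that workaround only happens to work because LoadFromFile resets the stream's position to 0.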