initBasti / Amazon2PlentySync (public) (License: GPLv3) (since 2019-01-27) (hash sha1)
Transfer your data from your Amazon Flatfile spreadsheet to the Plentymarkets system. A how-to is included in the README.
List of commits:
Subject Hash Author Date (UTC)
fixed a bug with the encoding of very large datasets were the 10000 letters were not enough, increased waiting time but removed some mistakes that way 8f431d6a68fb21699950b1ca48a1592976789c74 LagerBadel PC:Magdalena 2019-06-06 13:41:52
small debugging improvements in writeCSV and missing colors 88db9e1362a4178805671f443554a7f0d3db9e69 LagerBadel PC:Magdalena 2019-06-06 11:52:31
Major Update added a gui category chooser, attribute id connection and updated the whole Script to work on Elastic Sync a8a6156e26e2783a695c87bda35aba725fd77023 Sebastian Fricke 2019-06-05 12:45:29
fixed a bug with the encoding function 0c5b9dd0414037743bf39fdc3420d55035bffa61 Sebastian Fricke 2019-05-14 15:10:17
Major Update added a config file for better useability and added a gui to enter the category and the name of the product further work towards the rework from dynamic import to elastic sync e4356af15a4b8f7393f85bd51c16b330bc3555af Sebastian Fricke 2019-05-14 14:43:03
Changed the price upload to identify items that are not in plentymarkets and added a webshop price 4ab9bcd988f9eb26647748a8f80f25c8c5b7f2e2 Sebastian Fricke 2019-05-03 09:18:35
added Webshop to marketconnections 84f93694fe0c67972ad951649d9f6f0d577d3e29 Sebastian Fricke 2019-05-01 14:12:00
Added the modelnumber feature and removed the creation of empty features ea98391f2dbdf8fb8e601153b4f6ebfca504929c Sebastian Fricke 2019-05-01 12:31:19
Changed the feature upload into a loop for more overview 0a1bee82659a576c6fb4f2641aa3990d8d686b3c Sebastian Fricke 2019-05-01 10:04:20
Added a few new instructions to the Instructions file b4878c59958f89a02937de1dfc7aabbd23e71061 LagerBadel PC:Magdalena 2019-04-18 09:41:10
Made some fields not required but added Warnings for the log file, additionally some new amazon features were added. 6392338b7e9968be3bc4da9031144c3cc2cfae48 Sebastian Fricke 2019-04-18 09:37:51
Added an error log system and improved overall workflow 2e3763e436899466db9f03f70ea926869afd3219 Sebastian Fricke 2019-04-18 08:12:27
Added additional feature uploads 528cad4899d3e3adca5098c1a0ce92c2a6b8a853 Sebastian Fricke 2019-04-16 10:25:49
Added an optimization for the initial directory for Linux 58b340605cba0603520ada8a184cc9fba5f8c3b8 Sebastian Fricke 2019-04-16 10:22:18
Fixed a typo in the build script f7943d8b2c33b89b083380902f1b1281366a12b2 Sebastian Fricke 2019-04-16 08:13:51
Added a build script for Linux + removed the finished executables 8fcf82d5de859895d29a7f355c0d49700beb4e38 Sebastian Fricke 2019-04-16 08:10:13
Changed the EAN type from UPC to GTIN_13 which is the correct one. ea74c1d8c001ae6895f07bbecbcb9a0898400b95 Sebastian Fricke 2019-04-15 13:04:54
fixed a bug with item_name + changed the item_name assignment to include the variation name instead of the parent name 7dedb2bb9afac7d5625ccbf9c05f6ff4b1b1e5e1 LagerBadel PC:Magdalena 2019-04-15 12:32:33
Added usage instructions in english and german language. e2f291e2a00ac9283ab9d843e652d7b77fa6bbaf Sebastian Fricke 2019-04-15 09:59:36
Added usage instructions in english and german language. 30646f203ae8847cfa4971cb62187dca8406b8d7 Sebastian Fricke 2019-04-15 09:58:26
Commit 8f431d6a68fb21699950b1ca48a1592976789c74 - fixed a bug with the encoding of very large datasets were the 10000 letters were not enough, increased waiting time but removed some mistakes that way
Author: LagerBadel PC:Magdalena
Author date (UTC): 2019-06-06 13:41
Committer name: LagerBadel PC:Magdalena
Committer date (UTC): 2019-06-06 13:41
Parent(s): 88db9e1362a4178805671f443554a7f0d3db9e69
Signing key:
Tree: 37b8063b369e5339629885901978bc157de30169
File Lines added Lines deleted
packages/color.py 0 1
packages/item_upload.py 143 135
File packages/color.py changed (mode: 100644) (index 5f354ae..1d11780)
... ... def missingColor(flatfile, attributefile):
12 12 # Get the highest position number
13 13 highest_number = int(0)
14 14
15 print('Inside missingcolor: {0}'.format(attributefile))
16 15 with open(attributefile['path'], mode = 'r', encoding=attributefile['encoding']) as item:
17 16 reader = csv.DictReader(item, delimiter=';')
18 17
File packages/item_upload.py changed (mode: 100644) (index d148a5d..2ae4bad)
... ... def check_encoding(file_dict):
38 38 try:
39 39 with open(file_dict['path'], mode='rb') as item:
40 40 try:
41 raw_data = item.read(10000)
41 raw_data = item.read()
42 42 except Exception as err:
43 43 print("ERROR: {0}\n".format(err))
44 44 file_dict['encoding'] = chardet.detect(raw_data)['encoding']
45 print("chardet data for {0}\n{1}\n".format(file_dict['path'], chardet.detect(raw_data)))
46
45 47 except Exception as err:
46 48 print("Error : {0}\n".format(err))
47 49
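The hunk above swaps item.read(10000) for item.read(), so chardet now sees the whole file instead of only its first 10000 bytes; that matches the commit subject (slower on very large datasets, but fewer misdetected encodings). A minimal standalone sketch of that detection step, with 'some_flatfile.csv' as a purely illustrative path:

    import chardet

    def detect_encoding(path):
        # Read the whole file so chardet judges the encoding from all of the data,
        # not just the first 10000 bytes as before this commit.
        with open(path, mode='rb') as item:
            raw_data = item.read()
        # chardet.detect returns a dict like {'encoding': 'utf-8', 'confidence': 0.99, ...}
        return chardet.detect(raw_data)['encoding']

    print(detect_encoding('some_flatfile.csv'))  # illustrative path, not part of the repository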
 
... ... def itemUpload(flatfile, intern, stocklist, attributefile, folder, input_data):
90 92 # PACKAGE PROPERTIES
91 93 package_properties = get_properties(flatfile)
92 94
93 # FILL DICTIONARY
94 with open(flatfile['path'], mode='r', encoding=flatfile['encoding']) as item:
95 reader = csv.DictReader(item, delimiter=";")
96
97 for row in reader:
98 # transform the text format to integer in order to adjust the
99 # height, width, length numbers from centimeter to milimeter
100 try:
101 # SET KEYWORDS
102 keywords = ''
103 if(row['generic_keywords']):
104 keywords = row[ 'generic_keywords' ]
105
106 if(not(keywords)):
107 raise barcode.EmptyFieldWarning('generic_keywords')
108
109 # SET ATTRIBUTES
110 attributes = ''
111 if(row['parent_child'] == 'child'):
112 attributes = get_attributes(dataset=row, sets=color_size_sets)
113
114
115 try:
116 values = [
117 '', row['parent_sku'], row['item_sku'],
118 package_properties[ 'length' ] * 10, package_properties[ 'width' ] * 10,
119 package_properties[ 'height' ] * 10, package_properties[ 'weight' ],
120 row['item_name'], '104',
121 attributes,
122 '62', keywords,
123 row['brand_name'].upper(), '3',
124 input_data['name'], row['product_description'],
125 '', 'true', # externalID & active
126 '3', input_data['categories'],
127 input_data['categories'][0:2], input_data['categories'][0:2],
128 'Y', 'Y', # mandant
129 '', '', # barcode
130 'Y', 'Y', # marketconnection
131 'Y', 'Y', # marketconnection
132 'Y', 'Y', # marketconnection
133 '', '', # market & accout id amazonsku
134 '', '', # sku & parentsku amazonsku
135 '', '', '',# producttype & fba amazon
136 '','','','','',# prices
137 '', '', '', #asin
138 '', '', '', '', #image
139 '', '' # image
140 ]
141
142 except KeyError:
143 raise KeyError
144 print('Error at the Values')
145 Data[row['item_sku']] = SortedDict(zip(column_names, values))
146 except KeyError as err:
147 print("Error at : 'if(row['parent_child'] == 'parent'):'")
148 return row['item_sku']
149
150 # open the intern number csv to get the item ID
151 with open(intern['path'], mode='r', encoding=intern['encoding']) as item:
95 try:
96 # FILL DICTIONARY
97 with open(flatfile['path'], mode='r', encoding=flatfile['encoding']) as item:
152 98 reader = csv.DictReader(item, delimiter=";")
99
153 100 for row in reader:
101 # transform the text format to integer in order to adjust the
102 # height, width, length numbers from centimeter to milimeter
154 103 try:
155 if(row['amazon_sku'] in [*Data]):
156 Data[row['amazon_sku']]['ItemID'] = row['article_id']
157 Data[row['amazon_sku']]['ExternalID'] = row['full_number']
158 except KeyError as keyerr:
159 print(keyerr)
160 print("Keyerror at the Intern Number addition")
161
162 # Include the barcodes & asin
163 barcode_data = barcode.barcode_Upload(flatfile, stocklist)
164
165 for row in barcode_data:
166 try:
167 if(row in [*Data]):
168 Data[row]['EAN_Barcode'] = barcode_data[row]['EAN_Barcode']
169 Data[row]['FNSKU_Barcode'] = barcode_data[row]['FNSKU_Barcode']
170 Data[row]['ASIN-countrycode'] = barcode_data[row]['ASIN-countrycode']
171 Data[row]['ASIN-type'] = barcode_data[row]['ASIN-type']
172 Data[row]['ASIN-value'] = barcode_data[row]['ASIN-value']
173 except Exception as err:
174 print("ERROR @ Barcode Part for {0}.\n{1}.\n".format(row, err))
175
176 # Include the amazonsku
177 sku_data = amazon_data_upload.amazonSkuUpload(flatfile)
178
179 for row in sku_data:
180 try:
181 if(row in [*Data]):
182 Data[row]['marketid'] = sku_data[row]['MarketID']
183 Data[row]['accountid'] = sku_data[row]['MarketAccountID']
184 Data[row]['amazon_sku'] = sku_data[row]['SKU']
185 Data[row]['amazon_parentsku'] = sku_data[row]['ParentSKU']
186 except Exception as err:
187 print("ERROR @ SKU Part for {0}.\n{1}.\n".format(row, err))
188
189 # Include the amazonsku
190 ama_data = amazon_data_upload.amazonDataUpload(flatfile)
191
192 for row in ama_data:
193 try:
194 if(row in [*Data]):
195 Data[row]['amazon-producttype'] = ama_data[row]['ItemAmazonProductType']
196 Data[row]['fba-enabled'] = ama_data[row]['ItemAmazonFBA']
197 Data[row]['fba-shipping'] = ama_data[row]['ItemShippingWithAmazonFBA']
198 except Exception as err:
199 print("ERROR @ AmazonData Part for {0}.\n{1}.\n".format(row, err))
200
201 # Include the price
202 price_data = price_upload.priceUpload(flatfile)
203
204 for row in price_data:
205 try:
206 if(row in [*Data]):
207 Data[row]['price-price'] = price_data[row]['price']
208 Data[row]['ebay-price'] = price_data[row]['ebay']
209 Data[row]['amazon-price'] = price_data[row]['amazon']
210 Data[row]['webshop-price'] = price_data[row]['webshop']
211 Data[row]['etsy-price'] = price_data[row]['etsy']
212 except Exception as err:
213 print("ERROR @ Price Part for {0}.\n{1}.\n".format(row, err))
104 # SET KEYWORDS
105 keywords = ''
106 if(row['generic_keywords']):
107 keywords = row[ 'generic_keywords' ]
108
109 if(not(keywords)):
110 raise barcode.EmptyFieldWarning('generic_keywords')
111
112 # SET ATTRIBUTES
113 attributes = ''
114 if(row['parent_child'] == 'child'):
115 attributes = get_attributes(dataset=row, sets=color_size_sets)
116
117
118 try:
119 values = [
120 '', row['parent_sku'], row['item_sku'],
121 package_properties[ 'length' ] * 10, package_properties[ 'width' ] * 10,
122 package_properties[ 'height' ] * 10, package_properties[ 'weight' ],
123 row['item_name'], '104',
124 attributes,
125 '62', keywords,
126 row['brand_name'].upper(), '3',
127 input_data['name'], row['product_description'],
128 '', 'true', # externalID & active
129 '3', input_data['categories'],
130 input_data['categories'][0:2], input_data['categories'][0:2],
131 'Y', 'Y', # mandant
132 '', '', # barcode
133 'Y', 'Y', # marketconnection
134 'Y', 'Y', # marketconnection
135 'Y', 'Y', # marketconnection
136 '', '', # market & accout id amazonsku
137 '', '', # sku & parentsku amazonsku
138 '', '', '',# producttype & fba amazon
139 '','','','','',# prices
140 '', '', '', #asin
141 '', '', '', '', #image
142 '', '' # image
143 ]
144
145 except KeyError:
146 raise KeyError
147 print('Error at the Values')
148 Data[row['item_sku']] = SortedDict(zip(column_names, values))
149 except KeyError as err:
150 print("Error at : 'if(row['parent_child'] == 'parent'):'")
151 return row['item_sku']
152
153 # open the intern number csv to get the item ID
154 with open(intern['path'], mode='r', encoding=intern['encoding']) as item:
155 reader = csv.DictReader(item, delimiter=";")
156 for row in reader:
157 try:
158 if(row['amazon_sku'] in [*Data]):
159 Data[row['amazon_sku']]['ItemID'] = row['article_id']
160 Data[row['amazon_sku']]['ExternalID'] = row['full_number']
161 except KeyError as keyerr:
162 print(keyerr)
163 print("Keyerror at the Intern Number addition")
164
165 # Include the barcodes & asin
166 barcode_data = barcode.barcode_Upload(flatfile, stocklist)
167
168 for row in barcode_data:
169 try:
170 if(row in [*Data]):
171 Data[row]['EAN_Barcode'] = barcode_data[row]['EAN_Barcode']
172 Data[row]['FNSKU_Barcode'] = barcode_data[row]['FNSKU_Barcode']
173 Data[row]['ASIN-countrycode'] = barcode_data[row]['ASIN-countrycode']
174 Data[row]['ASIN-type'] = barcode_data[row]['ASIN-type']
175 Data[row]['ASIN-value'] = barcode_data[row]['ASIN-value']
176 except Exception as err:
177 print("ERROR @ Barcode Part for {0}.\n{1}.\n".format(row, err))
178
179 # Include the amazonsku
180 sku_data = amazon_data_upload.amazonSkuUpload(flatfile)
181
182 for row in sku_data:
183 try:
184 if(row in [*Data]):
185 Data[row]['marketid'] = sku_data[row]['MarketID']
186 Data[row]['accountid'] = sku_data[row]['MarketAccountID']
187 Data[row]['amazon_sku'] = sku_data[row]['SKU']
188 Data[row]['amazon_parentsku'] = sku_data[row]['ParentSKU']
189 except Exception as err:
190 print("ERROR @ SKU Part for {0}.\n{1}.\n".format(row, err))
191
192 # Include the amazonsku
193 ama_data = amazon_data_upload.amazonDataUpload(flatfile)
194
195 for row in ama_data:
196 try:
197 if(row in [*Data]):
198 Data[row]['amazon-producttype'] = ama_data[row]['ItemAmazonProductType']
199 Data[row]['fba-enabled'] = ama_data[row]['ItemAmazonFBA']
200 Data[row]['fba-shipping'] = ama_data[row]['ItemShippingWithAmazonFBA']
201 except Exception as err:
202 print("ERROR @ AmazonData Part for {0}.\n{1}.\n".format(row, err))
214 203
215 # Include the images
216 image_data = image_upload.imageUpload(flatfile, attributefile)
204 # Include the price
205 price_data = price_upload.priceUpload(flatfile)
217 206
218 for index, row in enumerate( image_data ):
219 try:
220 if(row in [*Data]):
221 Data[row]['Multi-URL'] = image_data[row]['Multi-URL']
222 Data[row]['connect-variation'] = image_data[row]['connect-variation']
223 Data[row]['mandant'] = image_data[row]['mandant']
224 Data[row]['availability'] = image_data[row]['availability']
225 Data[row]['listing'] = image_data[row]['listing']
226 Data[row]['connect-color'] = image_data[row]['connect-color']
227 except Exception as err:
228 print("ERROR @ Image Part for {0}.\n{1}.\n".format(row, err))
207 for row in price_data:
208 try:
209 if(row in [*Data]):
210 Data[row]['price-price'] = price_data[row]['price']
211 Data[row]['ebay-price'] = price_data[row]['ebay']
212 Data[row]['amazon-price'] = price_data[row]['amazon']
213 Data[row]['webshop-price'] = price_data[row]['webshop']
214 Data[row]['etsy-price'] = price_data[row]['etsy']
215 except Exception as err:
216 print("ERROR @ Price Part for {0}.\n{1}.\n".format(row, err))
217
218 # Include the images
219 image_data = image_upload.imageUpload(flatfile, attributefile)
220
221 for index, row in enumerate( image_data ):
222 try:
223 if(row in [*Data]):
224 Data[row]['Multi-URL'] = image_data[row]['Multi-URL']
225 Data[row]['connect-variation'] = image_data[row]['connect-variation']
226 Data[row]['mandant'] = image_data[row]['mandant']
227 Data[row]['availability'] = image_data[row]['availability']
228 Data[row]['listing'] = image_data[row]['listing']
229 Data[row]['connect-color'] = image_data[row]['connect-color']
230 except Exception as err:
231 print("ERROR @ Image Part for {0}.\n{1}.\n".format(row, err))
229 232
230 233
231 # Write Data into new CSV for Upload
232 # OUTPUT
233 # --------------------------------------------------------------
234 # Write Data into new CSV for Upload
235 # OUTPUT
236 # --------------------------------------------------------------
234 237
235 238 barcode.writeCSV(Data, "item", column_names, folder)
239 except UnicodeDecodeError as err:
240 print("Decode Error at line: {0}, err: {1}".format(sys.exc_info()[2].tb_lineno, err))
241 print("press ENTER to continue..")
242 input()
243 sys.exit()
236 244
237 245 def itemPropertyUpload(flatfile, folder):
238 246
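Most of the itemUpload hunk only re-indents the existing body under a try block; the functional addition is the new except branch (new lines 239-243), which reports where a UnicodeDecodeError surfaced and stops the run. As a standalone sketch of that pattern, with process_rows standing in for the wrapped body and 'some_flatfile.csv' as an illustrative path:

    import csv
    import sys

    def process_rows(reader):
        for row in reader:
            pass  # stand-in for the field mapping done in itemUpload

    try:
        with open('some_flatfile.csv', mode='r', encoding='utf-8') as item:  # placeholder path/encoding
            process_rows(csv.DictReader(item, delimiter=';'))
    except UnicodeDecodeError as err:
        # tb_lineno is the source line in this frame where the exception surfaced
        print("Decode Error at line: {0}, err: {1}".format(sys.exc_info()[2].tb_lineno, err))
        print("press ENTER to continue..")
        input()
        sys.exit()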
Hints:
Before your first commit, do not forget to set up your git environment:
git config --global user.name "your_name_here"
git config --global user.email "your@email_here"

Clone this repository using HTTP(S):
git clone https://rocketgit.com/user/initBasti/Amazon2PlentySync

Clone this repository using ssh (do not forget to upload a key first):
git clone ssh://rocketgit@ssh.rocketgit.com/user/initBasti/Amazon2PlentySync

Clone this repository using git:
git clone git://git.rocketgit.com/user/initBasti/Amazon2PlentySync

You are allowed to anonymously push to this repository.
This means that your pushed commits will automatically be transformed into a merge request:
... clone the repository ...
... make some changes and some commits ...
git push origin main
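
For example, a complete anonymous contribution could look like this (the edited file, branch name, and commit message are placeholders; any of the clone URLs above works):
git clone https://rocketgit.com/user/initBasti/Amazon2PlentySync
cd Amazon2PlentySync
... edit e.g. packages/item_upload.py ...
git add packages/item_upload.py
git commit -m "describe your change here"
git push origin main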