initBasti / Amazon2PlentySync (public) (License: GPLv3) (since 2019-01-27) (hash sha1)
Transfer your data from your Amazon Flatfile spreadsheet over to the Plentymarkets system. A how-to is included in the README.
List of commits:
Subject Hash Author Date (UTC)
Changed the item upload format to fix errors in the sync, moved active upload to properties because it has to be done seperatly to the creation process 3b466364e3dcdf14b4cef5b8649ec9573c992324 Sebastian Fricke 2019-06-17 14:09:23
Removed the image upload from item upload and added a exportfile together with functions to get the variation id for the image upload, the image upload is now a single process 6349c4a7177345c25aa6d8ecd03740a75fa2520f Sebastian Fricke 2019-06-13 12:58:36
Updated the feature list to the current active list b5d1675bcb56c38a97c928d7800b6a29c2dea116 LagerBadel PC:Magdalena 2019-06-11 12:11:06
fixed a bug with the encoding of very large datasets were the 10000 letters were not enough, increased waiting time but removed some mistakes that way 8f431d6a68fb21699950b1ca48a1592976789c74 LagerBadel PC:Magdalena 2019-06-06 13:41:52
small debugging improvements in writeCSV and missing colors 88db9e1362a4178805671f443554a7f0d3db9e69 LagerBadel PC:Magdalena 2019-06-06 11:52:31
Major Update added a gui category chooser, attribute id connection and updated the whole Script to work on Elastic Sync a8a6156e26e2783a695c87bda35aba725fd77023 Sebastian Fricke 2019-06-05 12:45:29
fixed a bug with the encoding function 0c5b9dd0414037743bf39fdc3420d55035bffa61 Sebastian Fricke 2019-05-14 15:10:17
Major Update added a config file for better useability and added a gui to enter the category and the name of the product further work towards the rework from dynamic import to elastic sync e4356af15a4b8f7393f85bd51c16b330bc3555af Sebastian Fricke 2019-05-14 14:43:03
Changed the price upload to identify items that are not in plentymarkets and added a webshop price 4ab9bcd988f9eb26647748a8f80f25c8c5b7f2e2 Sebastian Fricke 2019-05-03 09:18:35
added Webshop to marketconnections 84f93694fe0c67972ad951649d9f6f0d577d3e29 Sebastian Fricke 2019-05-01 14:12:00
Added the modelnumber feature and removed the creation of empty features ea98391f2dbdf8fb8e601153b4f6ebfca504929c Sebastian Fricke 2019-05-01 12:31:19
Changed the feature upload into a loop for more overview 0a1bee82659a576c6fb4f2641aa3990d8d686b3c Sebastian Fricke 2019-05-01 10:04:20
Added a few new instructions to the Instructions file b4878c59958f89a02937de1dfc7aabbd23e71061 LagerBadel PC:Magdalena 2019-04-18 09:41:10
Made some fields not required but added Warnings for the log file, additionally some new amazon features were added. 6392338b7e9968be3bc4da9031144c3cc2cfae48 Sebastian Fricke 2019-04-18 09:37:51
Added an error log system and improved overall workflow 2e3763e436899466db9f03f70ea926869afd3219 Sebastian Fricke 2019-04-18 08:12:27
Added additional feature uploads 528cad4899d3e3adca5098c1a0ce92c2a6b8a853 Sebastian Fricke 2019-04-16 10:25:49
Added an optimization for the initial directory for Linux 58b340605cba0603520ada8a184cc9fba5f8c3b8 Sebastian Fricke 2019-04-16 10:22:18
Fixed a typo in the build script f7943d8b2c33b89b083380902f1b1281366a12b2 Sebastian Fricke 2019-04-16 08:13:51
Added a build script for Linux + removed the finished executables 8fcf82d5de859895d29a7f355c0d49700beb4e38 Sebastian Fricke 2019-04-16 08:10:13
Changed the EAN type from UPC to GTIN_13 which is the correct one. ea74c1d8c001ae6895f07bbecbcb9a0898400b95 Sebastian Fricke 2019-04-15 13:04:54
Commit 3b466364e3dcdf14b4cef5b8649ec9573c992324 - Changed the item upload format to fix errors in the sync, moved active upload to properties because it has to be done seperatly to the creation process
Author: Sebastian Fricke
Author date (UTC): 2019-06-17 14:09
Committer name: Sebastian Fricke
Committer date (UTC): 2019-06-17 14:09
Parent(s): 6349c4a7177345c25aa6d8ecd03740a75fa2520f
Signing key:
Tree: 490861996960b7859e655fb29bfd7ad1f649f701
File Lines added Lines deleted
packages/gui/__pycache__/category_chooser.cpython-37.pyc 0 0
packages/item_upload.py 52 15
product_import.py 1 0
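The headline change of this commit is that the item upload no longer carries the active flag: the 'VariationActive' column is dropped from the item upload and an 'Active' column is appended to the property upload instead, as the diffs below show. A minimal sketch of a property-upload row with the new column set (hypothetical SKU, property id and values; the semicolon delimiter is an assumption based on the flatfile reader, the real file is written by barcode.writeCSV):

import csv

# Column set introduced in this commit for the property upload.
column_names = ['SKU', 'ID-property', 'Value', 'Lang', 'Active']

with open('property_upload.csv', 'w', newline='') as out:
    # delimiter assumed to match the ';' used when reading the flatfile
    writer = csv.DictWriter(out, fieldnames=column_names, delimiter=';')
    writer.writeheader()
    writer.writerow({'SKU': 'P100-S',      # hypothetical variation SKU
                     'ID-property': '7',   # hypothetical property id
                     'Value': 'example',   # hypothetical property value
                     'Lang': 'de',
                     'Active': 'Y'})       # the flag that moved here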
File packages/gui/__pycache__/category_chooser.cpython-37.pyc changed (mode: 100644) (index 623fdc6..85b21f6)
File packages/item_upload.py changed (mode: 100644) (index fc462e4..59433c1)
@@ ... @@ import csv
 import sys
 import re
 import chardet
+import collections
 from os.path import isfile
 from sys import exit
 from packages import barcode, amazon_data_upload, price_upload, image_upload
@@ ... @@ def get_variationid(exportfile, sku):
 def itemUpload(flatfile, intern, stocklist, attributefile, folder, input_data):
     # The column headers for the output file as expected from the
     # plentymarkets dataformat
-    column_names = ['ItemID', 'Parent-SKU', 'SKU',
+    column_names = ['Parent-SKU', 'SKU',
                     'Length', 'Width',
                     'Height', 'Weight',
                     'Name', 'MainWarehouse',
@@ ... @@ def itemUpload(flatfile, intern, stocklist, attributefile, folder, input_data):
                     'ItemOriginCountry', 'ItemTextKeywords',
                     'ItemProducer', 'ItemProducerID',
                     'ItemTextName', 'ItemTextDescription',
-                    'ExternalID', 'VariationActive',
+                    'ExternalID',
                     'VariationAvailability', 'Category-IDs',
                     'Standard-Category', 'Standard-Category-Webshop',
                     'Mandant-Active', 'Webshop-Active',
                     'EAN_Barcode', 'FNSKU_Barcode',
                     'market-active-shop', 'market-active-ebay',
                     'market-active-ebayger', 'market-active-amafba',
-                    'market-active-amafbager', 'market-active-webapi',
-                    'marketid', 'accountid',
+                    'market-active-amafbager', 'marketid', 'accountid',
                     'amazon_sku', 'amazon_parentsku',
                     'amazon-producttype', 'fba-enabled', 'fba-shipping',
                     'price-price', 'ebay-price', 'amazon-price',
@@ ... @@ def itemUpload(flatfile, intern, stocklist, attributefile, folder, input_data):
     # Unpack File and scrap data
     # INPUT
     # --------------------------------------------------------------
-    Data = SortedDict()
+    Data = dict()
+    sorted_Data = collections.OrderedDict()
     package_properties = {}
     barcode_data = {}
 
@@ ... @@ def itemUpload(flatfile, intern, stocklist, attributefile, folder, input_data):
 
         try:
             values = [
-                '', row['parent_sku'], row['item_sku'],
+                row['parent_sku'], row['item_sku'],
                 package_properties[ 'length' ] * 10, package_properties[ 'width' ] * 10,
                 package_properties[ 'height' ] * 10, package_properties[ 'weight' ],
                 row['item_name'], '104',
@@ ... @@ def itemUpload(flatfile, intern, stocklist, attributefile, folder, input_data):
                 '62', keywords,
                 row['brand_name'].upper(), '3',
                 input_data['name'], row['product_description'],
-                '', 'true', # externalID & active
+                '', # externalID
                 '3', input_data['categories'],
                 input_data['categories'][0:2], input_data['categories'][0:2],
                 'Y', 'Y', # mandant
                 '', '', # barcode
                 'Y', 'Y', # marketconnection
                 'Y', 'Y', # marketconnection
-                'Y', 'Y', # marketconnection
+                'Y', # marketconnection
                 '', '', # market & accout id amazonsku
                 '', '', # sku & parentsku amazonsku
                 '', '', '',# producttype & fba amazon
                 '','','','','','',# prices
                 '', '', '', #asin
-                item_flag
+                item_flag # item flag 1 the sign of the item status
             ]
 
         except KeyError:
@@ ... @@ def itemUpload(flatfile, intern, stocklist, attributefile, folder, input_data):
             except Exception as err:
                 print("ERROR @ Price Part for {0}.\n{1}.\n".format(row, err))
 
-        # Write Data into new CSV for Upload
-        # OUTPUT
-        # --------------------------------------------------------------
+        # Write Data into new CSV for Upload
+        # OUTPUT
+        # --------------------------------------------------------------
+
+        # Sort the dictionary to make sure that the parents are the first variant of each item
+        print("Sort Products")
+        sorted_Data = sort_Products(Data)
 
-        barcode.writeCSV(Data, "item", column_names, folder)
+        barcode.writeCSV(sorted_Data, "item", column_names, folder)
     except UnicodeDecodeError as err:
         print("Decode Error at line: {0}, err: {1}".format(sys.exc_info()[2].tb_lineno, err))
         print("press ENTER to continue..")
@@ ... @@ def itemPropertyUpload(flatfile, folder):
 
         properties[row['item_sku']] = dict(zip(property_names, values))
 
-    column_names = ['SKU', 'ID-property', 'Value', 'Lang']
+    column_names = ['SKU', 'ID-property', 'Value', 'Lang', 'Active']
     Data = {}
     for index, row in enumerate( properties ):
         for prop in property_id:
@@ ... @@ def get_properties(flatfile):
     properties = {'length':0,
                   'width':0,
                   'height':0,
-                  'weight':0}
+                  'weight':0,
+                  '1'}
 
     with open(flatfile['path'], mode='r', encoding=flatfile['encoding']) as item:
         reader = csv.DictReader(item, delimiter=";")
@@ ... @@ def find_similar_attr(flatfile):
             Data[row['parent_sku']]['size'].add(row['size_name'])
 
     return Data
+
+def sort_Products(dataset):
+    item_list = dataset.items()
+    new_dict = collections.OrderedDict()
+    parent_dict = collections.OrderedDict()
+    child_dict = collections.OrderedDict()
+    position_of_parent = 0
+
+    # Go through the items of the dataset
+    for item in item_list:
+        # When there is no entry in 'Parent-SKU' the item has to be a parent
+        if(not(item[0] in [* new_dict ])):
+            if(not(item[1]['Parent-SKU'])):
+                # add the parent to the new dict
+                new_dict[item[0]] = item[1]
+                # get all the children and update the itemlist without them
+                child_dict = search_child(item_list, item[0])
+                # add each child to the new dict after the parent
+                for child in child_dict:
+                    new_dict[child] = child_dict[child]
+
+    return new_dict
+
+def search_child(item_list, parent):
+    child_dict = collections.OrderedDict()
+
+    for item in item_list:
+        if(item[1]['Parent-SKU'] == parent):
+            child_dict[item[0]] = item[1]
+
+    return child_dict
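The new sort_Products/search_child helpers reorder the export so that every parent row is immediately followed by its variant rows. A standalone sketch of that ordering with hypothetical SKU data (simplified illustration, not the repository's exact code):

import collections

def sort_products_sketch(dataset):
    # Parents (rows without a 'Parent-SKU' value) come first, each one
    # immediately followed by the rows that reference it as parent.
    ordered = collections.OrderedDict()
    for sku, row in dataset.items():
        if not row['Parent-SKU']:
            ordered[sku] = row
            for child_sku, child_row in dataset.items():
                if child_row['Parent-SKU'] == sku:
                    ordered[child_sku] = child_row
    return ordered

# Hypothetical flatfile rows: one parent with two size variants.
data = {
    'P100-S': {'Parent-SKU': 'P100'},
    'P100':   {'Parent-SKU': ''},
    'P100-M': {'Parent-SKU': 'P100'},
}

print(list(sort_products_sketch(data)))  # ['P100', 'P100-S', 'P100-M']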
File product_import.py changed (mode: 100644) (index e4fe0f4..f8f82cb)
@@ ... @@ def main():
 
     if(user_data):
         # Check if there is already a log folder within the upload folder
+        print(upload_folder)
         if( not(os.path.exists(os.path.join(upload_folder, 'log'))) ):
             log_folder = os.path.join(upload_folder, 'log')
             os.makedirs(log_folder)
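The product_import.py hunk only adds a debug print of the chosen upload folder. The existing check-then-create pattern for the log directory could also be written with the standard-library shortcut os.makedirs(..., exist_ok=True); a sketch with a hypothetical folder path:

import os

upload_folder = '/tmp/amazon2plenty'        # hypothetical upload folder
log_folder = os.path.join(upload_folder, 'log')

# Creates the folder if it is missing and silently succeeds if it already exists.
os.makedirs(log_folder, exist_ok=True)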