initBasti / Amazon2PlentySync (public) (License: GPLv3) (since 2019-01-27) (hash sha1)
Transfer your data from your Amazon Flatfile spreadsheet to the Plentymarkets system. A how-to is included in the README.
List of commits:
Subject | Hash | Author | Date (UTC)
Update to property Upload because it isn't necessary to have a property for each material | 37c01ce472dc85a9f7a39bd164a2ea53c28b4955 | Sebastian Fricke | 2019-03-18 16:08:14
added a filter for items with only 1 size that currently works with single parent child combinations | fd3bf2b659614d5518884eb3da77b564cd0018eb | Sebastian Fricke | 2019-02-28 16:08:57
Market connection adjusted to AmazonFBA | 1feb4b2e96c6a55ad0494696fd18fd6fb42babb0 | Sebastian Fricke | 2019-02-25 12:21:05
current version Feb 2019 | 00b24836dd378f21942ed323c2b66f928b9fb4c4 | Sebastian Fricke | 2019-02-25 09:00:00
Changes to fit to the new flatfile format | 91cd339571f607e88f6e922f1a47630c4c8d62a7 | Sebastian Fricke | 2019-02-08 13:28:02
Small removal of redundant code | b271de0b1be1d83be088b00a35b5618af088b58a | Sebastian Fricke | 2019-01-30 18:08:15
General improvements and property upload | bb48084db4359210eb892a04f1322f6fda822bef | Sebastian Fricke | 2019-01-30 17:43:32
Fixed scripts according to dataformat changes + readme | dec28d9e6ff5c5c903d5ca01a969e661d43b66c6 | Sebastian Fricke | 2019-01-29 21:08:04
Working Checkboxes and file import | 25378c68a6220c1c6570642920e6150a50415153 | Sebastian Fricke | 2019-01-29 21:03:23
Added checkboxes, descriptions, import and runbutton | 2021f0960e70c8c229ec08488165dc01b998a6e0 | Sebastian Fricke | 2019-01-27 22:19:18
Added market connection, cosmetics in product import | c9a771d5e7a3a80adc650e773c568e00dd8e2aea | Sebastian Fricke | 2019-01-23 15:01:47
Amazon Data Upload | 33dbd0ed6945c01d8917ceae3cf3964f051a2288 | Sebastian Fricke | 2019-01-22 14:43:39
Readme started, amazon sku upload, vari upload, images | f43a9e83598c3e4623bcb08667e2b4e649b2cdea | Sebastian Fricke | 2019-01-22 10:44:40
Amazon SKU Upload | 8586da2ae91d49c81a0d9b6ff220c8a1b1b011a6 | Sebastian Fricke | 2019-01-16 18:36:54
Inital Commit with current working version of the CLI Tool and the work in progress of the GUI. | 207fef4277f7c169aa79eb39ec1aaaab258b888c | Sebastian Fricke | 2019-01-16 09:47:43
Initial commit | ba965ee75fe09437fb08da5edd25b20e39e17eff | Sebastian Fricke | 2019-01-16 09:42:30
Commit 37c01ce472dc85a9f7a39bd164a2ea53c28b4955 - Update to property Upload because it isn't necessary to have a property for each material
Author: Sebastian Fricke
Author date (UTC): 2019-03-18 16:08
Committer name: Sebastian Fricke
Committer date (UTC): 2019-03-18 16:08
Parent(s): fd3bf2b659614d5518884eb3da77b564cd0018eb
Signing key:
Tree: a58696281649678de2ab78653976f118d10816a2
File | Lines added | Lines deleted
packages/amazon_data_upload.py | 29 | 16
packages/item_upload.py | 5 | 13
File packages/amazon_data_upload.py changed (mode: 100644) (index 3e4bd0d..0e82e4b)

@@ def amazonSkuUpload(flatfile, export):
     item_number = 1
     for row in reader:
         if(row['VariationID']):
-            values = [row['VariationID'], '4', '0', '', '']
+            values = [row['VariationID'], '104', '0', '', '']
             Data[row['VariationNumber']] = SortedDict(
                 zip(column_names, values))

@@ def amazonSkuUpload(flatfile, export):

 def amazonDataUpload(flatfile, export):

-    column_names = ['ItemAmazonProductType', 'ItemAmazonFBA', 'bullet_point1',
-                    'bullet_point2', 'bullet_point3', 'bullet_point4',
-                    'bullet_point5', 'fit_type',
-                    'lifestyle', 'batteries_required',
-                    'supplier_declared_dg_hz_regulation1',
-                    'supplier_declared_dg_hz_regulation2',
-                    'supplier_declared_dg_hz_regulation3',
-                    'supplier_declared_dg_hz_regulation4',
-                    'supplier_declared_dg_hz_regulation5', 'ItemID',
-                    'ItemShippingWithAmazonFBA']
+    column_names = [
+        'ItemAmazonProductType', 'ItemAmazonFBA',
+        'bullet_point1', 'bullet_point2', 'bullet_point3',
+        'bullet_point4', 'bullet_point5',
+        'fit_type', 'lifestyle', 'batteries_required',
+        'supplier_declared_dg_hz_regulation1',
+        'department_name', 'variation_theme', 'collection_name',
+        'material_composition', 'size_map', 'size_name',
+        'color_map', 'ItemID', 'ItemShippingWithAmazonFBA'
+    ]

     Data = SortedDict()

@@ def amazonDataUpload(flatfile, export):
               row['bullet_point5'], row['fit_type'],
               row['lifestyle'], row['batteries_required'],
               row['supplier_declared_dg_hz_regulation1'],
-              row['supplier_declared_dg_hz_regulation2'],
-              row['supplier_declared_dg_hz_regulation3'],
-              row['supplier_declared_dg_hz_regulation4'],
-              row['supplier_declared_dg_hz_regulation5'],
-              '', '1']
+              row['department_name'],
+              row['variation_theme'],
+              row['collection_name'],
+              row['material_composition'],
+              row['size_map'],
+              row['size_name'],
+              row['color_map'],
+              '0', '1']
+
         Data[row['item_sku']] = SortedDict(zip(column_names, values))

+        if(row['parent_child'] == 'child' and row['parent_sku'] in [*Data]):
+            for key in column_names:
+                if(not(Data[row['parent_sku']][key])):
+                    try:
+                        Data[row['parent_sku']][key] = row[key]
+                    except Exception as err:
+                        print(err)
+
     with open(export, mode='r') as item:
         reader = csv.DictReader(item, delimiter=";")
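The new block in amazonDataUpload lets child variations back-fill fields that their parent row left empty, so shared attributes end up on the parent SKU. A minimal standalone sketch of that pattern, assuming flatfile-style dicts with `item_sku`, `parent_sku`, and `parent_child` keys (the CSV reading and SortedDict handling from the real code are omitted):

```python
# Sketch of the parent/child fill-in pattern from the diff above.
# Child rows copy their values into any parent field that is still
# empty, so the parent SKU accumulates data shared by its variations.

def fill_parent_fields(rows, column_names):
    data = {}
    for row in rows:
        # store this row's values under its own SKU
        data[row['item_sku']] = {key: row.get(key, '') for key in column_names}
        # child rows fill in whatever the already-seen parent left blank
        if row.get('parent_child') == 'child' and row.get('parent_sku') in data:
            parent = data[row['parent_sku']]
            for key in column_names:
                if not parent[key]:
                    parent[key] = row.get(key, '')
    return data
```

This assumes parent rows appear before their children in the flatfile, which is also what the diff's `row['parent_sku'] in [*Data]` check relies on.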
File packages/item_upload.py changed (mode: 100644) (index 7130bff..09aff3f)

@@ def itemPropertyUpload(flatfile, export):
             if(re.search(r'(cotton|baumwolle)',
                          row['outer_material_type'].lower())):

-                material[row['item_sku']] = 4
+                material[row['item_sku']] = 7
                 value[row['item_sku']] = "Baumwolle"
             if(re.search(r'(hemp|hanf)',
                          row['outer_material_type'].lower())):

-                material[row['item_sku']] = 5
+                material[row['item_sku']] = 7
                 value[row['item_sku']] = "Hanf"
             if(re.search(r'(viskose|viscose)',
                          row['outer_material_type'].lower())):

-                material[row['item_sku']] = 6
-                value[row['item_sku']] = "Viskose"
+                material[row['item_sku']] = 7
+                value[row['item_sku']] = "Viskose"

     with open(export, mode='r') as item:
         reader = csv.DictReader(item, delimiter=';', lineterminator='\n')

@@ def itemPropertyUpload(flatfile, export):
         Data = {}
         for row in reader:
             if(row['AttributeValueSetID'] == ''):
-                values = ['3',
-                          row['ItemID'],
-                          row['VariationName'],
-                          'de',
-                          'PANASIAM']
-
-                Data[row['VariationNumber'] + '1'] = dict(zip(column_names,
-                                                              values))
                 if(row['VariationNumber'] in [*material]):
                     values = [material[row['VariationNumber']],
                               row['ItemID'],

@@ def itemPropertyUpload(flatfile, export):
                               'de',
                               value[row['VariationNumber']]]

-                    Data[row['VariationNumber'] + '2'] = dict(zip(column_names,
+                    Data[row['VariationNumber'] + '1'] = dict(zip(column_names,
                                                                   values))
     variation_upload.writeCSV(Data, "property", column_names)
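The change in itemPropertyUpload matches the commit message: instead of a separate Plentymarkets property per material (IDs 4, 5, 6), every detected material now uses a single property (ID 7) and carries the material name as the property value. A standalone sketch of that consolidated mapping, using the patterns, property ID, and German value strings from the diff (the first matching pattern wins here for simplicity, whereas the original applies each check in turn):

```python
import re

# One property ID for all materials; the material name becomes the value.
MATERIAL_PATTERNS = [
    (r'(cotton|baumwolle)', 'Baumwolle'),
    (r'(hemp|hanf)', 'Hanf'),
    (r'(viskose|viscose)', 'Viskose'),
]

def match_material(outer_material_type):
    """Return (property_id, value) for a flatfile material string, or None."""
    text = outer_material_type.lower()
    for pattern, value in MATERIAL_PATTERNS:
        if re.search(pattern, text):
            return (7, value)
    return None
```

For example, `match_material('100% Cotton')` yields `(7, 'Baumwolle')`, while an unmatched material like polyester yields `None` and no property row is written.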
Hints:
Clone this repository using HTTP(S):
git clone https://rocketgit.com/user/initBasti/Amazon2PlentySync

Clone this repository using ssh (do not forget to upload a key first):
git clone ssh://rocketgit@ssh.rocketgit.com/user/initBasti/Amazon2PlentySync

Clone this repository using git:
git clone git://git.rocketgit.com/user/initBasti/Amazon2PlentySync
