渣总's Slider and Click-Captcha Image Coordinate Calculation (Rust Version)
RUST-FFI
Background
I recently picked up a new trick: Rust can use FFI (Foreign Function Interface) to implement methods and functions for Python. That got me thinking about reimplementing 渣总's slider and click-captcha coordinate calculation algorithm in Rust. Python's bit manipulation has always struck me as too strange, so I had been refusing to implement it in Python; discovering that Rust can directly build a package that Python can import was a nice little surprise, and a good excuse to learn how it works. Many thanks to 渣总 for providing the Java version of the algorithm. 渣总 is the GOAT.
The FFI workflow
Results
The sample image comes from the 猿人学 practice platform (https://www.python-spider.com/); its use here was approved by the site's authors. Thanks to 平哥 and 卞大.
Only a single image is used as the sample here, purely for demonstration and study in this article. The image is shown below:
Sample image
The algorithm involves two concepts: the background image and the challenge image. The challenge image is the one shown above; the background image is the original image with the shadow removed. The algorithm therefore requires the set of images to be finite. I generated the corresponding background images in advance; if you don't have them, there are two ways to get them:
- Call the image_magic.avg_b64() function to compute the background image.
- Find a friend and ask them for the full set of background images (^.^); this method requires having a friend.
# -*- coding: utf-8 -*- # @Author : __LittleQ__ # @FileName: main.py import base64 import os from io import BytesIO import image_magic import numpy as np import requests from PIL import Image from cv2 import cv2 from skimage.metrics import structural_similarity as ssim def base64_cv2(base64_str): sbuf = BytesIO() sbuf.write(base64.b64decode(base64_str)) pimg = Image.open(sbuf) return cv2.cvtColor(np.array(pimg), cv2.COLOR_RGB2BGR) class ImageSimilar: def __init__(self, image_path): self.image_path = image_path with open(image_path, "rb") as f: self.image_base64 = base64.b64encode(f.read()) self.image = cv2.imread(image_path) def find_similar_image(source, cg): target = None max_similar = 0 for mask in source: similar = ssim(mask.image, cg, multichannel=True) if max_similar < similar: max_similar = similar target = mask return target if __name__ == '__main__': images_list = os.listdir("./masks") masks = [] for image in images_list: masks.append(ImageSimilar(f"./masks/{image}")) data = { "img1": "/9j/4AAQSkZJRgABAQAAAQABAAD/2wBDAAgGBgcGBQgHBwcJCQgKDBQNDAsLDBkSEw8UHRofHh0aHBwgJC4nICIsIxwcKDcpLDAxNDQ0Hyc5PTgyPC4zNDL/2wBDAQkJCQwLDBgNDRgyIRwhMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjL/wAARCACWASwDASIAAhEBAxEB/8QAHwAAAQUBAQEBAQEAAAAAAAAAAAECAwQFBgcICQoL/8QAtRAAAgEDAwIEAwUFBAQAAAF9AQIDAAQRBRIhMUEGE1FhByJxFDKBkaEII0KxwRVS0fAkM2JyggkKFhcYGRolJicoKSo0NTY3ODk6Q0RFRkdISUpTVFVWV1hZWmNkZWZnaGlqc3R1dnd4eXqDhIWGh4iJipKTlJWWl5iZmqKjpKWmp6ipqrKztLW2t7i5usLDxMXGx8jJytLT1NXW19jZ2uHi4+Tl5ufo6erx8vP09fb3+Pn6/8QAHwEAAwEBAQEBAQEBAQAAAAAAAAECAwQFBgcICQoL/8QAtREAAgECBAQDBAcFBAQAAQJ3AAECAxEEBSExBhJBUQdhcRMiMoEIFEKRobHBCSMzUvAVYnLRChYkNOEl8RcYGRomJygpKjU2Nzg5OkNERUZHSElKU1RVVldYWVpjZGVmZ2hpanN0dXZ3eHl6goOEhYaHiImKkpOUlZaXmJmaoqOkpaanqKmqsrO0tba3uLm6wsPExcbHyMnK0tPU1dbX2Nna4uPk5ebn6Onq8vP09fb3+Pn6/9oADAMBAAIRAxEAPwAtIo3kxJwMdavFQm2T7ynjBqpEoA5q0g3EA9K+Oqs/SKqblcr3SpAnmdXPA9qycl3yTV/UG/elewHAqrGnGe5qobHTRVo3Y5FzwKsRxc0Qx1dji71lOokTUqWYyCL5ulX4rfJBxg1asrOObjkMP1rU+yIqgY6V51XEWZ5lbFLmsZ8Nv0GK1baPy0wOtEcIBGBV6KE7eR1rz6mIuzzq9a4wKSOOlKIQRVqOEntTxFsOetcrrnE6ttjPaEUzya1GhDjK8GojFTVUcaxmmPHUVBJEM9K1Xi45FQtCp7VtGqbwqmLc7YYmdhkD0rMk1CHptJyPyrX1yJU02Q5wK46Ar8rsjqHYjDHp15/SvYwkVUhdnfRkpRuy1eXXmIoUEc5IrEu9RD4SIEEHBzW7byWrQtGYyx5O89jWK+lPLfttI2k5/OvToKEXqb87s+VFNMu3Qkmr9vat1b8q0TpZs8KVA4znvSqgXitXWvojopbXIxEMUEYqUsuSAeR1qJiCaqJ0KRGy7mFSJCDSoMmp415obHKVkKsIFTxqB2p4jzUsUOT0rCbOac+46KPceauRxjPSlgg6cVcSDkVk5HDUqiRxAkZ6VMIBz19qlSKrSR5Ws3I4Z1NSmtv3pwgxV9YeOlO8jHWs3IydYorD7Uoh9qviEU4QA9BUuRDrFEQe1IbYk1prDt7cVKI48crS5jN17HjLTMWGOMVs6Z5dywEsgQdzWV5OCacu5PuEg16FWHMj7KrFVI2Wg/UIw1+yxHeOmRQts6MAwwfSpLXck6uR35rdOnNdyrMjDYa5qlRU1ZmFSv7FKDMqKDBBq/FBkCr0+nsJFKp8gGKlhtjwMYrzquITV0cUsUmrofarsX5Rhu5rQjQsOeaSG3OOlaEMWABivLrVU2eXWqpsZFbnNWlTFPC4p2K4XO5xSncQDFIRTwBVe7u4LNA8zYBpRTk7IzV2yUcGh15yKht7uG5z5ZOR2IxVjHFOSlB2eg9UyBlzUbR4qyUzVHVdRtdIs3ubyVURRnk8n6VrS55yUYq7ZakYnixlh0N2bPLgDFcdFdxSQBrmMxkHamfuk1m+LPHS62scNtEUtom3sSeWPYVn6NdjUrh4pBnc25AOxr7HB4KpRwydTRnTQrNe6dMIyW3qPLBHGOhqSPdG+SPmPeo2sHtZ0MM5I4yo54q9HakhSc9s0nKJ61ObKs7zykbST2JNZqvP9oYO5G010SQKJRleM1HqkKtI4RVJDDkVdKrG9rFTquMlFGMW/esUJwfWnqDmrMVnk9DUv2Tb2NdPOjppuxHCharsVswAYrwehpbK237uD1xWvDbfKq84FDehnXr20RRWGrUEQq6tr7VNFaEHOK55M4qmITI4YvariRc9Kkihx2q0sRHasJM4alXUiSEelWVhp8ae1ThM1i2cc6juRrFgYpfLzU4WnBahsxc2QiMCnhBTwnNPAqWS5shMZNJ5VWMUbaSYuZnjflCnLEPSrPl+1Sxwn6V6dSskj7h1LIjhgywAHWuhs9lvGELZJrPht9vJq9DEcDrXk4mrzs87ET5y1PuZlVTxinSK0VnM6nDLGzA47gVLEmSMjpU1zH/xLrn/AK5P/I15UqnvJHmVJqKPOo/FWubR/pvb/nkn+FSjxbrgHF9/5CT/AOJrEQfKPpTuBXvvD0W/gX3I8lzl3Nv/AIS/Xf8An+/8hJ/8TXot9qItCFEe5vrivF5bo
J05ru9Tnuby8OXIAIAx71xYvBQco2SS1/Q7MHRdWXvbG7H4kQy7HiAHfB6Vmaxei9ugIzmPoKyooijA7jv9auxQ+ZIGPQcmslh6dN80T1FhoU5XRKt+1o6yoQCnXJ611Gn6nbajHugkDEfeAPSuD1ADeNhznjNYcd1eaVcTTWk5RmHQdK0lgo4lXvZir4RVI80dz0vX/E2n+HbUy3co8wg7IweWNeH67r1/4ku5JLiRggJKRjoB6VBdDUNVvjcXLvLI55Jrf0Xw5cXTIZEKRD7zEfyr1sLg6GAhzN3l3/yPMhTk9zndJ0VtQeRN5VgOBtyCfrXe6LoUGlQ7Y18yY43SH09BW1ZaPBap5VvGFXue5rVgsivO32rnxWac/urY7adOFPV7mTHZ7P4c+9TtDtUe1axs29MD2qGa2LcY4rzfrN3udKrXZniEMuaia2+cnHWtEWxHrTzCCOlbQr6lKojL+zgUhhrTMVRPEK7KeI7mkapHYxABvrWjGgBqnGVhUsSAM1dhcSBWXoa7faXVzmqyuy9FGuBVgIPSoI6tp0rCVRM4KjdwWMCpFWgVIKwcjBtjlGKkApgp4NZtmUhwpwpuaUGkTYdQKbkUZFImw/NGaZkUZFILHm4gPpTwqL15q1ICSABURiOelQ6rZ9T7S5Ytwso64x61owxA9CKxgrA96ljkki+6xFcdRX6mFSm5bM6KKEcU68QLptz/ANcn/kaw0vJ0O4OaLvVpJ4HhJwHUqSvXmuR0JOSdzhnhastEecZCoCTwBVOa7B+VenrXUN4bsXUKZ7rH++v/AMTUY8Kad/z2uv8Avpf/AImvpI4qgtW39xlDLqu7OV3gj5hmvWJY4ZG3A4PpiuTHhPTz/wAtrr/vpf8A4muiEhrjxteNXl9n0udlDDTp3LMOm+adyuMd6tHTzFE+3niqCTyL91iPoalF3PjBkNefJzfU0lCrfcoLYFlO5SRnIqlNpCiYO0ZIz0ra+0Sj+L9KcJ3I5Cmto15xN41KkdjIg8MW/wBsMr8RjkLW0IoYkClgqgcAVBJM7fxYHtVZ2A6tROpOpuzJUXLU1Ibyzj67uPaiTWYkf93HlR3NYrSp61E0gOcVKoJ7miwcZbnT22r21ydjjyz2z0q2YEblWBriTJini+uEGEmcD2NTLCa+67ESy+V/cZ2DQgDnFQFUOcMDj0Ncq17cyD5p5D/wKmrLIOBIwz6GtI4ZrqEcFUW7OpaMYyKhZKybW/miAUtuX0NasNzFMOuG9DV+9AidOVPcp3sJkEcXIVjkke1aNrGEhQAEADgVUu2jWVNzMOONtXLSRZFwr7wO/cV3Qqt0jnnLUux8VaRqqLUymsucwmrlkEU8Gq4anbqq9zJxLAanbqrh6XfQQ4lgNS7qr76XfSFyk++jdUG+jfQLlJ91YF/4ia2u3hjj3BeCT61sb6x77SIbu5M24KSOauG5rRjHm94qbV9KML3FRiVc4zTJJgDivKvI9ZRZMTGO1RPNEo4xmqUsrZqElj61Sg3uaxpX3LE10TwCBVUtk9aTY5PSlEbE9K0UUjeMUg3U9SDSrAT61KtsfSk2kDkkIMU4H2qVbY+hqeO1PpWbmkYyqRRWGT2p2D6VfFrjtQ0YFZOojJ1U3oZ7ZUZqqZXBOK0JI8moHh5rWMkaxkupUZ3bqaicE1cMNNNvxWikjaMome2RULFq0WgqPyPatYzRtGcTPO71pOautBz0pphArRTRoqkSupNSryak8oAUo2jrT5kDmia2iDyAE4HrV4QFXwOR6iobLy/NBf7takxRVBXOAeuOKyldnm16j57CW9uG5Zc/WrSWoiuN6DarLyB60y3cBD71cLruCk4JHFaQTcbI82rL32Ko4pc4prOqDJao3lUDOahXTsxK7LAejzKqCTNL5orVA4Frf70u+qu/il3j1q7C5C15nvSebVbf703zPegr2aLnm0ebVXzfejzfeiwvZokurwW9u8p/hGa46bUrqWVn851yegPSuj1BTcWUsa/eI4rjGJRircMDyDW9KOh6OBoU5J8xp2V2DMBIpPvV6a5gZ/lU++Kxxc26k7ST71EdSMUuY1Uj1Nd1TKOZ2OuVDmd4o3AiOQwH50pQscKnSsWPU5kyxwwPalbWrgsBGiKPT1rkq5VKl5kOhUWxtrAxOCMU8W4z1qna6wsi4uE2MP7vNaMV7aEqd/X9K8qrSnF2sc83Ujo0TxWyGrSWyDtUME8UoPlsCBxVpTXBUU0zjnKQeQg7UjIqDipM8VhTa0sOoSI5JiXgbaUKU5vQmEZzehqmoiuazz4gsu4k/KkPiGyHaT8qv6vV7Gyp1F0LzQk9qabf2qp/wkVkO0n5VBdeIbcxgw7wQecjtVww9Vu1ilGr2L3kH0prQHmrMcqSxLIh3KwyDSOcClZp2YKbKLQn0phiwOatMc1Xdsgit4RujaMmU5flNVZZPStEosgwetV2iVW+YcV2UqDkdNOa6lF7gBOnNQ/aVON3SrksMZbgVWaGLbgLz616VDBN7nTFxZo27QLGJGb5fT1qU6xBnyhnnpkVmCVI4ggHI9KSOUMQzRHOOK9eOWrlTfQ53QUndnQWl0ZLd3SPftOMjtUi6gBOvmj7w6elZOiam9lHcxiPd5jdx0rQuIZGCuI88Z4onhqdOLckrHBUornknonsaV7dWm1Np7c471VM6SL8oOAazyg3qWU89c1pwGLZgAV4Ff6u56KxCpKnFJO4x7naMAVEt0W9jUzFecqKrOED5qFKkl7prFLsSm4wRzT1uMnrmqTso60KwCg1fJoVyI0A5pN/vVXzxt6800S571DiT7JlzfR5mKqedS+aKLB7OxZ82q8ltbSuWeIFj3pvmA96N49aNUUoNbM5yTS7u3fy5IwN3Gc5pJtLljb7ynA6V0Mrfa5+vuKry+VuIc5I4r06OcxqStU0O2OJqaX3Oe8uRXClT+VTNECBjr34raUQ4+T71PSyW6bKgADrgda6sRXgopqSdzR4trVqxlwRnPSrUcQDZxip3h8uTy+Rirf2aMw5JGa82sk1zGM63V9StCpT7rYz6GriaqkR2s27FQrAig5krC1bSfOfzrS9eKTPzqDwa5I4eNR+/oc8lCe5uXesSSxlIcqD/FWKwJOTWauk6jyW1B6jbS9Q3n/iYv1reGGo01pJXKp2h8KNJk9aZ5QzwQfxrNfTL8j/AI/3/OmjRtQ2n/iYnB/iz0qlGnb4jT2suxqsqrwcVEWUHqKoDRLspltSkP40xtHmAIOoSZ+tOKpr7Qe0l2ZtWmpT2H+qYPH/AM82PH4VpReJbeR9kytGcdcZFccNHl76hL/31VzT9Nghm8y+uppAp+Rd3B+tKphcPOPM3qYygt7Hcq6yRhlOQRkGq0qlRSQ3EcyKtuynjoKikuQsbo33un0rzvZ2lZEIIZwr5PI9alvHUJkdxxVKBGlcKBhafdFYhsZs46c16NKBrCN2V9x9MmpFhBjVtjZParlqkKCNmTcXHU9qIb7bMUYKAD0Ir06L5fU0lUb0gtiGLTnecb12L1Ymt1NJs5Y4wrDAH3xVa5Yp
B8z7iRk1St9XjhHkk49K9KlKT+I5Je1rK8XsaDRQ2rtGqgjPBxVlp1ht8jHSsKa+G9Tnkg8VFd6hiDlu1en7GPstRrDSlbmNG01ASylXUHJxV25sVELSxkhuvFcrp1yhQyM+CGwBXURakDbhSOMdTXh18DCsvhFiKUoSTgUv3iqCTu9qqvcZcjpin31zsj3Qnoe9Yk1y6qJjglicivKxGXUqc0ov5HTQg5Fue9/0hVAOxD8z9gfQ1N9pzKFc7M9O4/Oq0N7aJYGIxmQvkucdTWTNeT2zCVImNtuwd/J/CtPYpq0UKbUH72h0hJxkEH3FPUfLnNYVpq9ndMBFL5bf3CcVoi5YLgjI9RWcsOkUnzK6LZcetLvGKqGUEZpPNHY1jKnY09nfcteaBS+YDVUyDFJ5hpezH7NFL7ZIsgYHBqVp2di/c1lmbYgHVs09bpgMbaxeDl2O10Vuka9pMrTKGIGTW+rxWxCowOa4kTkEECrA1Z4znyT/AN9VjUwNSWlzkr4Vyemx1zrBMd7jBXrXJ3HiDyZHRV34Ygc0kmuTOMBNp+tZstmdSkWMSGNWPzECu3C0JwVqmpzRoyp3ctjQn1uSWEKYtj+oNQm/vdq7Fj+p71Yt/C6xxmNp3kI5DZ61S1KxVY4/LDLsGCd3WtnSd7LYlSV7XHNqGpfd2x564waZHd6i0n3Y/Q9aoeWI5PNV2LehNVxdLDIxMbknr89KOEvskU7rqdCft7qNhg5571E41MAjfbDHUbuaxU1UJJuETDH+3Ui3xuJDII8AHkF+tdKwVJLU5XOpbQ0J5L+FRvaEA9DmokkvG5LRbfXNZ99fwyxJGYWyuc5frVdLqLGFRgvoXqJYRr4TSLl10N1kutud8XPvUQe6R8ebCDjoTWaNQWOPa0ZbPT5ulQSTgziQhgOOCfSs1hpdSo8zerOrsJZrSXzcq7fkKvtqWWMjwIX9a5I6lExyYSeOz0rXqMhAibB/2653g5Slc39nF9DpJtdEhRdpVzwcdKRp0IJkcHuOa5m2MfmYZTg+9aW5FGFTgj1rshhowdzoSpxVkakerxm4RGkUL0HNXtRihVkaKVSW5xXPRRosqyBcEHNadxdJchQIdrDvmtlFc6ZE4+8nHbqW4tWjWyW3lLNNnGe2KzLhfOZmh6g4qUaeZJ3dHBUJuHHWq9mk9u5G/IY4ORXU66Vki6MYxTcSJr5jIkZOX6cUmo3LLayYzlQM/nUq27R3HnIVDBu4pt7BJevMxZYycZG3PeutYxcqiwqc0r8q0Muwu5GlVUPJPX0rtoh/oI53MvJNYFhpywCJ3YOwbrtxmtqLWI7VZE+zBskjJatY4qile5jKM7aK7K91cBj5e4KByapM+nkbiSx9FPGasf2mo8wNbI4b17VWN9FEuI7RVyc8GvHnShWm5zlZl8k+iCGcGQeRaTYHAwvFJrMc9vAsgAG4jMbHrVKa9YuWXcoPbdUN1fT3KBHfIGMVpQUKb953QqmFry1TTKMi2kqnzEeBz3qxay3ts2be489PTOaHmRwA8ecDHWpY7q3hOVtgp/vA1dXEQlsiI4GvHXlLketfMEu4Hhb1I4q/HcxyKGRgw9RWZJqqTR/vEV1XswpZPEEW1US1UKvocVwzaktI6nTTp1lpKJsIXkPA4qwtpIwzvArEi8RLsGLYDI/vVcj8QAp/qR/31XHNVb6IJQqP4UZm3PJpWbbin/fbCjk082xC813nTKZW8zJ4zTllKjpn61YSEZxs61O2nq4KhkRvQmkkc06j2HWFvHejGwA5wa0rayht7kIc5z26VnaXHJa6hGHZM54G7g10Z8r7RGWK7yRwKJLU8qvUqqTvsSRWr/apHDAgL930rIubNvmV1yDXRo6NdyKqjeVyRmqjlQTuIFS2kcim5M4G5tmhkK9wapywrIORz6iu5l0/T7hyzzcmqU/hywcZju3B9quNRdzuhUbXLJHBSg28uWXco7etVTcOWJHyjPQV2dxocEORI7up7gVh6rpNtbRLJblyxPKtWyqRelwnRajzJGO05fryRTVl5zwKjkBTJXr3HpVffgdea2SuYKZoJMpI3nPNWsLcMArDI7Vjo43DmrULHIMWS+e1TKNjRVC5na+G6jsKmVtzADHNC28kgBkwh9TSrHFEw53nv7VkaRmTxxP2Hfg1oJHJtGRVeO8CAJhRjoBVyG580HA5FZybNlIkCMq5JqeA7c8nNVnmCkZ70qzLn5Sc1KbN4aq5pxTtGyMn3l7+1KXBcyMuWznjpVW3lBHSpDJg1a7lJK4yUgtlwfXimtNndgfeps0wLZNVzMM0HZGmmtS3Hc7Qqn+E5pJ2yxYdzmqEkoJzSxyGRhzxUuTK9gl7yLS4PWiRQQeajLbTURlw5OeKE2zkqO0itNxVctwamvODkdKzmkPrTOmk00T7qVHXd8x4xVTzTnrTg1JndCV1qWCSYxsAww5NRSso2hTkjg0mAR3pNoxwKnmQOm9x0cqLEAeozmp4pF2ZzjPIqiww27FSp5TLk8H0o0Zyu6Z0Q+QEjrnFGZHOA350UVRxUZNptjfLl6+ZyKqTSymT55CT6iiikm7mkW3uWtOlLXcatyVPBNdLC2b6Ef7Q/nRRSlucGYK0NDag/wCQ3cD0irJUlt2fWiisq2x5GH/iopzLtY4oilK8etFFckj3HsaMcKyoxYZAXNchrl4i7VEQ69TRRWtFe8jnhqpXOdaNZmJIxn0rMurYxzhFYYb1oor1U2jlnFEqaeqJlmLGr9quBhQqhRz70UU3ruZokknPzei9RUJVWj8w7gCeAD0ooqDRBDdfPsK5YDg1atrmVrldpC7jzRRQlobRZeuFLSkFjlaajHfheMCiikdUHoXY5CoA9amDE9aKKRrHciuD0qqT3ooqT0aexEzc1Pb8CiioZpL4RJZDM5iX5cdTVZi1o4DHehOMd6KKqJwVUrXJJhuh57VkOfnIoopl0NiMnBp6vxRRSZ2RZdS3JmEe/qM5xTRyKKK55bmtOTejBbZ51yrqo9MU/wDs5/8Anov5UUVUThrtqWh//9k=", "img2": 
"iVBORw0KGgoAAAANSUhEUgAAABkAAACWCAYAAAArDyNuAAAD20lEQVR4nO2Wy47jyBFFz41Mqnvg//+1+YQB7Ebb1VMPkSIz7iwyqWJj2jtvDOQBVKKKVGRE3HgIJpPJZDKZTCaTyWTyv0WP32/OSIiEIrwIqpESCQpCIZBRCBWwwBiTEAIBmHExCIIAREUGJamEACKggMLjWSEJCSjDoPs9M+wO28LdA0AIzkOCAiQUoIDLOKycsSZI/bsyyOQZiU7PNY4wUj/g6QCihgJLlAhcTIYhQDGc1jjo4rnHdXc6xo0zINPDPiMyVYJSoutRcniaVwc/U63zj6//+LXY57sgCKMikBCBJEIxxP7ZzH83eX0ini9JoKBSwPGZhh7p1br/burU4hnZz4H/7FFSWeheV6B4lKjxmbLrN4YDVl4S1jUwEEOsJMn08M9Uip/VpAIOQ/QeucYwKhM/de59IYSyYBt5GM7+aq07Ux1GZ7oEKXctLgJoiNMrqrchw/OggCs6Wj+kJbGZtge0hfe3B9UxDEbvgWffPovo8zSfnT3uBUIu0AIeCQe0TbQP2FazP8z7nx7CjyZDJvSZGumZ9TFKRpIMSqGs0IAt8QrtDtsHHGtw/0i2jwctlx4JMhppYjShzgmhPkV8CUcOogk14TXJjyRX2N9NW03bC74HuYpSblQXI+tZf1ctfoUsohW0B37A/tbY3w54iOMOPgqlVXQkJUUgqjVU+KnIf2FcXQgDaobdsENbEz/EjYVSsz8RSS5QCbCpBRMjVb4UrU9t1HOVe45pUNHjhlfwvZH3HpFKYVGBxeCgUMgDskENGAPwrP7+njoPGsVwQGQQDnj7il/N477R1iSPZOXgVheWZUEE5VagimxJJVrvjav3l+te3ZVKwZtp7wkvD463YD+SPAzZx3+zKdkXXkgoggihx3auoDHDTg2AaFDyho4v8NrYf2ysL4nfAu0FDHbiMd01RryAkCilsCyV+pxQ8Wlc6n1Qjq/ofqO9bOwvO8dro91Bj4acBDGGavda12Fp4+PgOJKqNtbruW09Vm0L9L6Q/0m2bxvHq/EKcQilh08mQkT0iovoo73XYeJ016TmDSl6kyWfQjwgv+9s/9rZX422oOxCB0g+FywRQJyf/VzXp9cRUH2vKCui9Jw1cCbtY2f958rjW+vatKC00sc5bRw0bD1z7j4eNNb3+PFR//3HB3Vd+G35B1+/3MgGf75/sP64c3tpfLkLIwoaM1SECoERidqlUgCcnxoPL6q/m3ZP9jhQ3WhHkutGWU3ZCpGjYhQ85ygiLs5/7unzRwSQ5zidTCaTyWQymUwmk8lkMplMJpPJZDKZTCaTyWQymUwm/wf8Be67GBT3HdtgAAAAAElFTkSuQmCC" } cg_image = base64_cv2(data['img1']) bg_image = find_similar_image(masks, cg_image) result = image_magic.top_n(bg_image.image_base64.decode("utf-8"), data['img1'], 25, 1) left_top = (result[0].get_x() - 13, result[0].get_y() - 13) right_bottom = (result[0].get_x() + 13, result[0].get_y() + 13) point_color = (255, 0, 0) # BGR thickness = 1 line_type = 4 cv2.rectangle(cg_image, left_top, right_bottom, point_color, thickness, line_type) cv2.imwrite("./xxx.png", cg_image) print(result[0].get_x(), result[0].get_y())
For the image-similarity matching, the scikit-image library is used. Python already has solid implementations, so there is no need to reimplement perceptual hashing or histogram comparison in Rust; the corresponding Python libraries are used directly. The background images used in this article can be found via the link at the end. Calling the algorithm produces the following output:
Algorithm output
Once again, thanks to the 猿人学 platform for providing the sample image. Only one image is used in this demonstration. If you want to reproduce the example above while studying, I suggest saving the images for verification, keeping the load on the 猿人学 site in mind, and using crawling techniques in a reasonable and compliant way.
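Before moving on to the implementation, here is a minimal sketch of the scikit-image matching step used above, stripped down from the full script (the file names are hypothetical, and the mask and challenge images are assumed to have identical dimensions):

import cv2
from skimage.metrics import structural_similarity as ssim

def best_background(mask_paths, challenge_path):
    """Pick the pre-saved background most similar to the challenge image."""
    challenge = cv2.imread(challenge_path)
    best_path, best_score = None, -1.0
    for path in mask_paths:
        mask = cv2.imread(path)
        score = ssim(mask, challenge, multichannel=True)
        if score > best_score:
            best_path, best_score = path, score
    return best_path

# Example (hypothetical files):
# print(best_background(["./masks/0.png", "./masks/1.png"], "./challenge.png"))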
Implementation
For the FFI layer I chose the PyO3 library, which in my experience is fairly easy to get started with. Let's begin with a simple example that every programmer knows: the classic hello world program.
First, configure the dependencies by adding the following to Cargo.toml:
[package]
name = "image-magic"
version = "0.1.0"
edition = "2021"

# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html

[lib]
name = "image_magic"
crate-type = ["cdylib"]

[dependencies]
pyo3 = { version = "0.14", features = ["extension-module"] }

[build-dependencies]
pyo3-build-config = "0.14"  # used by the Python build
I won't cover how to create a Rust project or install Rust here; there is plenty of material online that you can search for. One thing to note is that you must define the lib name and crate type. I use the same name as 渣总, image_magic, which is the name Python will use for the import, i.e. import image_magic, and the crate-type is cdylib.
Next, add a build.rs to handle the build script:
fn main() {
    println!("cargo:rerun-if-changed=build.rs");
    pyo3_build_config::add_extension_module_link_args();
}
After that, create a directory named image_magic and put an __init__.py in it with the following content:
from .image_magic import *
With the configuration in place, we can write a demo to check whether the setup works. Add the following to src/lib.rs:
use pyo3::{prelude::*};

#[pyfunction]
pub fn demo_py_function() -> PyResult<String> {
    return PyResult::Ok(String::from("hello rust ffi!"));
}

#[pymodule]
fn image_magic(_py: Python, m: &PyModule) -> PyResult<()> {
    m.add_function(wrap_pyfunction!(demo_py_function, m)?)?;
    Ok(())
}
Now let's verify that the function works. This requires two Python dependencies, maturin and ipython, both of which can be installed directly with pip.
Then run maturin develop, after which the package can be used directly from the Python console.
Example of calling Rust from Python
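For reference, in a Python or IPython session the call looks roughly like this (a minimal sketch based on the demo_py_function defined above):

import image_magic

# demo_py_function is the Rust function registered in the pymodule above.
print(image_magic.demo_py_function())  # prints "hello rust ffi!"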
With hello world running, we can move on to the Rust implementation with confidence.
ImageAvgMerger
Let's start with a simple one. This function takes a set of images that share the same background, merges the similar images, and recovers the "truest" image from them, i.e. extracts the background. The rough idea, as far as I can tell, is to average the pixel values across the images, keep the samples closest to the average (roughly the top 3/4; the code below actually keeps 85%), and discard the rest as noise. I won't go into the details of the algorithm here; the Java version is fairly easy to follow. The corresponding Rust implementation is shown below.
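Before the Rust code, here is a rough NumPy sketch of that idea as I read it (my own illustration, not the author's implementation; it assumes the inputs have already been resized to the same shape, and keep_ratio mirrors the 0.85 used in the code below):

import numpy as np

def avg_background(images, keep_ratio=0.85):
    """images: list of equally sized HxWxC uint8 arrays of the same scene."""
    stack = np.stack(images).astype(np.int64)       # N x H x W x C
    first_avg = stack.mean(axis=0)                  # H x W x C
    # Per-pixel distance of every sample to the first-pass average.
    diff = np.abs(stack - first_avg).sum(axis=-1)   # N x H x W
    keep = max(1, int(len(images) * keep_ratio))
    # For each pixel, keep the samples closest to the average and re-average them.
    order = np.argsort(diff, axis=0)[:keep]         # keep x H x W
    selected = np.take_along_axis(stack, order[..., None], axis=0)
    return selected.mean(axis=0).astype(np.uint8)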
The Rust side needs a few image-processing crates, listed below. Because I'm lazy, I didn't implement proper image-buffer interchange between Python and Rust; instead I took the shortcut of passing everything around as base64. The same approach is used for the hill top algorithm later, so I won't explain it again:
[dependencies]
base64 = "0.13"            # base64 encoding/decoding
image = "0.23"             # image processing
rustc-serialize = "0.3.22"
use image::{Rgba, Pixel, DynamicImage, GenericImageView, GenericImage}; use std::collections::{BTreeMap}; use crate::image_utils::rgb_diff; #[derive(Copy, Clone, Debug)] struct RGBA { r: u64, g: u64, b: u64, p: u64, total_record: u64, } impl RGBA { pub fn new() -> RGBA { RGBA { r: 0, g: 0, b: 0, p: 0, total_record: 0, } } pub fn set_val(&mut self, val: Rgba<u8>) { self.total_record += 1; self.r += val[0] as u64; self.g += val[1] as u64; self.b += val[2] as u64; self.p += val[3] as u64; } pub fn avg_rgb(&self) -> Rgba<u8> { Rgba::from_channels( (self.r / self.total_record) as u8, (self.g / self.total_record) as u8, (self.b / self.total_record) as u8, (self.p / self.total_record) as u8, ) } } pub(crate) fn avg(input: &Vec<DynamicImage>) -> DynamicImage { let mut width_total: u64 = 0; let mut height_total: u64 = 0; for img in input { width_total += img.width() as u64; height_total += img.height() as u64; } let width = (width_total / input.len() as u64) as u32; let height = (height_total / input.len() as u64) as u32; // println!("width = {}, height = {}", width, height); let mut points = vec![vec![RGBA::new(); height as usize]; width as usize]; for img in input { let mut fixed = img.clone(); if (img.width() != width) || (img.height() != height) { fixed = fixed.thumbnail(width, height); } for i in 0..width { for j in 0..height { let val = fixed.get_pixel(i as u32, j as u32); points[i as usize][j as usize].set_val(val); } } } let mut first_avg_img = vec![vec![Rgba::from([0, 0, 0, 0]); height as usize]; width as usize]; for i in 0..width { for j in 0..height { first_avg_img[i as usize][j as usize] = points[i as usize][j as usize].avg_rgb(); } } let mut output: DynamicImage = DynamicImage::new_rgba8(width as u32, height as u32); for i in 0..width { for j in 0..height { let mut top_point: BTreeMap<u64, Rgba<u8>> = BTreeMap::new(); let mut index = 0; for img in input { let rgb_diff = rgb_diff(img.get_pixel(i, j), first_avg_img[i as usize][j as usize]); top_point.insert((((rgb_diff as u64) << 32) + index) as u64, img.get_pixel(i, j)); index += 1; } let avg_point_size = (input.len() as f64 * 0.85) as u32; let mut avg_point_index = 0; let mut rgba = RGBA::new(); for key in top_point.keys() { rgba.set_val(*top_point.get(key).unwrap()); avg_point_index += 1; if avg_point_index >= avg_point_size { break; } } output.put_pixel(i, j, rgba.avg_rgb()); } } output } #[cfg(test)] mod tests { use crate::image_avg_merger::{RGBA, avg}; use image::{Rgba, Pixel}; use base64::{decode, encode}; #[test] fn test_rgba() { let mut rgba = RGBA::new(); rgba.set_val(Rgba::from_channels(123, 123, 0, 0)); println!("{}", rgba.r) } #[test] fn test_avg() { let mut input = vec![]; let img = image::open("./src/images/0.jpg").unwrap(); input.push(img); let img = image::open("./src/images/1.jpg").unwrap(); input.push(img); let img = image::open("./src/images/2.jpg").unwrap(); input.push(img); let img = image::open("./src/images/3.jpg").unwrap(); input.push(img); let output = avg(&input); output.save("./src/output.jpg").unwrap(); } }
With that, the function is complete. A quick test shows that it works, so the next step is to turn it into a Python interface by adding the following code to src/lib.rs.
#[pyfunction]
pub fn avg_b64(input: &PyList) -> PyResult<String> {
    let mut image_input = vec![];
    for src in input {
        let target = decode(src.to_string()).unwrap();
        let img = image::load_from_memory(&target).unwrap();
        image_input.push(img);
    }
    let result = image_avg_merger::avg(&image_input);
    let mut buf = vec![];
    result.write_to(&mut buf, image::ImageOutputFormat::Png).unwrap();
    return PyResult::Ok(base64::encode(&buf));
}

// Tell image_magic which functions can be called from Python
#[pymodule]
fn image_magic(_py: Python, m: &PyModule) -> PyResult<()> {
    m.add_function(wrap_pyfunction!(avg_b64, m)?)?; // register the function above
    Ok(())
}
Now that it is written, let's write some code to test how well it works:
import base64

import image_magic


def test_avg_b64():
    image_list = []
    for i in range(4):
        with open(f"./images/{i}.jpg", "rb") as f:
            image = base64.b64encode(f.read()).decode("utf-8")
        image_list.append(image)
    result = image_magic.avg_b64(image_list)
    with open("./avg_b64.png", "wb") as f:
        f.write(base64.b64decode(result))


if __name__ == '__main__':
    test_avg_b64()
A quick try shows it works. Next up is the core of the whole slider/click-selection algorithm.
ImageHilltopV2
To be honest, I don't really understand the underlying principle of this algorithm either; I rewrote it in Rust directly from the Java code, so this article won't explain how the algorithm itself works. I remember 渣总 had a blog post describing the idea, but the blog seems to be unreachable now; the link is in reference [4], so have a look if it opens for you. The idea is genuinely brilliant. Hats off to 渣总 once more.
Let's go straight to the code. I removed the image-saving feature from the original implementation, so for now this version only returns coordinates. If you need more than that, modify the source yourself; I didn't implement everything completely, and some special cases may be buggy. I couldn't find suitable images to reproduce them, so the questionable spots are marked with comments; adjust the code there yourself if you run into problems.
use pyo3::{prelude::*}; use std::f64::consts::{SQRT_2, PI}; use image::{DynamicImage, GenericImageView}; use crate::image_utils::rgb_diff; use std::cmp::{min, max}; // 这里需要用到Point这个class, 按照py的方式导出 #[pyclass] #[derive(Copy, Clone, Debug)] pub struct Point { x: usize, y: usize, weight: usize, } impl Point { fn new(x: usize, y: usize, weight: usize) -> Point { Point { x, y, weight } } } // 提供两个方法获取坐标的值 #[pymethods] impl Point { pub fn get_x(&self) -> PyResult<u32> { PyResult::Ok(self.x as u32) } pub fn get_y(&self) -> PyResult<u32> { PyResult::Ok(self.y as u32) } } pub struct HilltopParamAndResult { background_image: DynamicImage, challenge_image: DynamicImage, ch_size: u32, top_n: usize, avg_diff: u32, } impl HilltopParamAndResult { pub fn new(background_image: DynamicImage, challenge_image: DynamicImage, ch_size: u32, top_n: usize) -> HilltopParamAndResult { HilltopParamAndResult { background_image, challenge_image, ch_size, top_n, avg_diff: 0, } } } struct XY { x: usize, y: usize, weight: u64, } impl XY { pub fn new() -> XY { XY { x: 0, y: 0, weight: 0, } } pub fn update(&mut self, x: usize, y: usize, weight: u64) { if weight > self.weight { self.x = x; self.y = y; self.weight = weight; } } } struct AggregateMountain { diff_data: Vec<Vec<u64>>, width: usize, height: usize, is_last: bool, next: Option<Box<AggregateMountain>>, } impl AggregateMountain { fn new(diff_data: Vec<Vec<u64>>, width: usize, height: usize) -> AggregateMountain { AggregateMountain { diff_data, width, height, is_last: false, next: None, } } pub fn fetch_top_point(&self) -> XY { if self.is_last { let mut xy = XY::new(); for i in 0..self.width { for j in 0..self.height { xy.update(i, j, self.diff_data[i][j]); } } return xy; } let next_xy = self.next.as_ref().unwrap().fetch_top_point(); let start_x = next_xy.x * 5; let end_x = min(next_xy.x * 5 + 4, self.width - 1); let start_y = next_xy.y * 5; let end_y = min(next_xy.y * 5 + 4, self.height - 1); let mut xy = XY::new(); for i in start_x..=end_x { for j in start_y..=end_y { xy.update(i, j, self.diff_data[i][j]); } } return xy; } pub fn invalid_rectangle(&mut self, left_top_x: usize, left_top_y: usize, right_bottom_x: usize, right_bottom_y: usize) { if self.is_last { return; } let next_start_x = left_top_x / 5; let mut next_start_y = left_top_y / 5; let mut next_end_x = (right_bottom_x + 4) / 5; let mut next_end_y = (right_bottom_y + 4) / 5; if left_top_x % 5 != 0 { if next_start_x < 1 { next_end_x = 0; } } if left_top_y % 5 != 0 { if next_start_y < 1 { next_start_y = 0; } // next_start_y = max(next_start_y - 1, 0); } if right_bottom_x % 5 != 0 { next_end_x = min(next_end_x + 1, self.next.as_ref().unwrap().width - 1); } if right_bottom_x % 5 != 0 { next_end_y = min(next_end_y + 1, self.next.as_ref().unwrap().height - 1); } // fill in next diff data for x in next_start_x..=next_end_x { for y in next_start_y..=next_end_y { let scan_start_x = x * 5; let scan_start_y = y * 5; let scan_end_x = min(scan_start_x + 4, self.width - 1); let scan_end_y = min(scan_start_y + 4, self.height - 1); // let center_x = (scan_start_x + scan_end_x) / 2; // let center_y = (scan_start_y + scan_end_y) / 2; let mut aggregate_diff = 0; for next_x in scan_start_x..=scan_end_x { for next_y in scan_start_y..=scan_end_y { aggregate_diff += self.diff_data[next_x][next_y]; } } self.next.as_mut().unwrap().diff_data[x][y] = aggregate_diff; } self.next.as_mut().unwrap().invalid_rectangle(next_start_x, next_start_y, next_end_x, next_end_y); } } pub fn gen_aggregate_mountain_mapping(&mut self) { if self.width 
< 5 || self.height < 5 { self.is_last = true; return; } let next_width = (self.width + 4) / 5; let next_height = (self.width + 4) / 5; let next_data = vec![vec![0; next_height]; next_width]; self.next = Option::from(Box::new(AggregateMountain::new(next_data, next_width, next_height))); self.next.as_mut().unwrap().gen_aggregate_mountain_mapping(); } } struct Rectangle { top_x: usize, top_y: usize, bottom_x: usize, bottom_y: usize, } impl Rectangle { pub fn rectangle_range(x: usize, y: usize, slice_size: usize, total_width: usize, total_height: usize) -> Rectangle { // println!("x = {}, y = {}, slice_size = {}, width = {}, height = {}", x, y, slice_size, total_width, total_height); let half_slice_size = slice_size / 2; let top_x = if x > half_slice_size { x - half_slice_size } else { 0 }; let top_y = if y > half_slice_size { y - half_slice_size } else { 0 }; let mut right_bottom_x = x + half_slice_size; let mut right_bottom_y = y + half_slice_size; if right_bottom_x >= total_width { right_bottom_x = total_width - 1; } if right_bottom_y >= total_height { right_bottom_y = total_height - 1; } Rectangle { top_x, top_y, bottom_x: right_bottom_x, bottom_y: right_bottom_y, } } } pub fn sqrt(x: usize) -> usize { let mut a: usize = 1; while a * a <= x as usize { a = a + 1; } return (a - 1) as usize; } fn adjust_center_point(top_xy: XY, mountain: &AggregateMountain, result: &HilltopParamAndResult, result_width: usize, result_height: usize, result_diff: &Vec<Vec<i32>>) -> XY { let points = Rectangle::rectangle_range(top_xy.x, top_xy.y, (result.ch_size * 2) as usize, result_width, result_height); let thumb_times = sqrt(result.ch_size as usize) as u32; let mut short_curt_width = result.ch_size / thumb_times; short_curt_width *= 2; let mut short_curt = vec![vec![0; short_curt_width as usize]; short_curt_width as usize]; for i in 0..short_curt_width { for j in 0..short_curt_width { let start_x = i * thumb_times + points.top_x as u32; let start_y = j * thumb_times + points.top_y as u32; let end_x = min(start_x + thumb_times - 1, (result_width - 1) as u32); let end_y = min(start_y + thumb_times - 1, (result_height - 1) as u32); let mut total_diff = 0u64; for x in start_x..=end_x { for y in start_y..=end_y { total_diff += result_diff[x as usize][y as usize] as u64; } } short_curt[i as usize][j as usize] = total_diff; } } let short_curt_mountain_width = short_curt_width / 2; let mut short_curt_xy = XY::new(); let mut short_curt_mountain = vec![vec![0; short_curt_mountain_width as usize]; short_curt_mountain_width as usize]; for i in 0..short_curt_mountain_width { for j in 0..short_curt_mountain_width { let center_x = i + short_curt_mountain_width / 2; let center_y = j + short_curt_mountain_width / 2; let rect = Rectangle::rectangle_range(center_x as usize, center_y as usize, short_curt_mountain_width as usize, short_curt_width as usize, short_curt_width as usize); let mut aggregate_diff = 0.0; for x in rect.top_x..=rect.bottom_x { for y in rect.top_y..=rect.bottom_y { let base = short_curt[x][y] as f64; let distance = (((x as f64 - center_x as f64) * (x as f64 - center_x as f64) + (y as f64 - center_y as f64) * (y as f64 - center_y as f64)) as f64).sqrt(); let distance_ratio = distance / (SQRT_2 * ((short_curt_mountain_width) / 2) as f64); if distance_ratio > 1.0 { continue; } let ratio = ((PI * distance_ratio).cos() + 1.0) / 2.0; aggregate_diff += base * base * base * ratio; } } short_curt_mountain[i as usize][j as usize] = aggregate_diff as usize; short_curt_xy.update(center_x as usize, center_y as usize, 
aggregate_diff as u64); } } // 在缩略图里面寻找最高点,之后再回放到原图进行 let real_start_x = short_curt_xy.x as usize * thumb_times as usize + points.top_x; let real_end_x = short_curt_xy.x * thumb_times as usize + thumb_times as usize + points.top_x; let real_start_y = short_curt_xy.y * thumb_times as usize + points.top_y; let real_end_y = short_curt_xy.y * thumb_times as usize + thumb_times as usize + points.top_y; let mut xy = XY::new(); for i in real_start_x..=real_end_x { for j in real_start_y..=real_end_y { let rect = Rectangle::rectangle_range(i, j, result.ch_size as usize, result_width, result_height); let mut aggregate_diff = 0.0; for x in rect.top_x..=rect.bottom_x { for y in rect.top_y..=rect.bottom_y { let distance = (((x as i64 - i as i64).pow(2) + (y as i64 - j as i64).pow(2)) as f64).sqrt(); let distance_ratio = distance / (SQRT_2 * (result.ch_size / 2) as f64); if distance_ratio > 1.0 { continue; } let ratio = ((PI * distance_ratio).cos() + 1.0) / 2.0; aggregate_diff += mountain.diff_data[x][y] as f64 * ratio; } } xy.update(i, j, aggregate_diff as u64); } } xy } pub fn find_top_n(mut result: HilltopParamAndResult) -> Vec<Point> { // 挑战图的宽和高 let width = result.challenge_image.width() as usize; let height = result.challenge_image.height() as usize; // 缩放底图,如果宽和高不一致的话 let bg_image = result.background_image.clone(); let cg_image = result.challenge_image.clone(); // 这里写法好像有点bug, 目前没解决, 就当图都一样大吧,不一样自己用open-cv处理一下, ^.^ // if (bg_image.width() != result.width) || (bg_image.height() != result.height) { // bg_image = bg_image.thumbnail(result.width, result.height); // } let mut total_diff = 0u64; let mut diff = vec![vec![0; height]; width]; let mut calculate_diff = vec![vec![0u64; height]; width]; // 计算背景图和挑战图的像素差 for i in 0..width { for j in 0..height { let rgb_diff = rgb_diff(cg_image.get_pixel(i as u32, j as u32), bg_image.get_pixel(i as u32, j as u32)); diff[i][j] = rgb_diff; calculate_diff[i][j] = rgb_diff as u64; total_diff += rgb_diff as u64; } } let avg_diff = total_diff as f64 / (width * height) as f64; let mut mountain = AggregateMountain::new(calculate_diff, width, height); mountain.gen_aggregate_mountain_mapping(); mountain.invalid_rectangle(0, 0, width - 1, height - 1); let mut ret = vec![]; result.avg_diff = avg_diff as u32; for i in 0..result.top_n { let mut top_xy = mountain.fetch_top_point(); top_xy = adjust_center_point(top_xy, &mountain, &result, width, height, &diff); let point = Point::new(top_xy.x, top_xy.y, top_xy.weight as usize); ret.push(point); if i < result.top_n - 1 { trip_aggregate_mountain(&mut mountain, top_xy, &result, width, height); } } ret } fn trip_aggregate_mountain(mountain: &mut AggregateMountain, top_xy: XY, result: &HilltopParamAndResult, result_width: usize, result_height: usize) { let start_x = max(top_xy.x - result.ch_size as usize / 2, 0); let end_x = max(top_xy.x + result.ch_size as usize / 2, (result_width - 1) as usize); let start_y = max(top_xy.y - result.ch_size as usize / 2, 0); let end_y = max(top_xy.y + result.ch_size as usize / 2, (result_height - 1) as usize); let mut max_diff = 0; for x in start_x..=end_x { for y in start_y..=end_y { if mountain.diff_data[x][y] > max_diff { max_diff = mountain.diff_data[x][y]; } } } for x in start_x..=end_x { for y in start_y..=end_y { let distance = ((x as i64 - top_xy.x as i64).pow(2) as f64 + (y as i64 - top_xy.y as i64).pow(2) as f64).sqrt(); let distance_ratio = distance / result.ch_size as f64; if distance_ratio > 1.0 { continue; } // y = 1- x*x / 2.25 权值衰减函数,为2次函数,要求命中坐标: (0,1) (1.5,0) // 
当距离为0的时候,衰减权重为1,当距离为1.5的时候,衰减权重为0 // 当距离为1的时候, 衰减权重为:1- 1/2.25 = 0.55 mountain.diff_data[x][y] = (mountain.diff_data[x][y] as f64 - max_diff as f64 * (1.0 - distance_ratio * distance_ratio / 2.25)) as u64; // 这块逻辑我也没测试到走这块,有可能有特定的图可能会overflow吧,目前没测试到, 如果存usize的话, 这块是不会走的 // if mountain.diff_data[x][y] < 0 { // mountain.diff_data[x][y] = 0; // } } } mountain.invalid_rectangle(start_x, start_y, end_x, end_y); } #[cfg(test)] mod tests { use crate::image_hill_top_v2::{HilltopParamAndResult, find_top_n}; #[test] fn test() { let bg_image = image::open("./src/images/out_mask.png").unwrap(); let cg_image = image::open("./src/images/img.png").unwrap(); let result = HilltopParamAndResult::new(bg_image, cg_image, 25, 1); let result = find_top_n(result); println!("{}", result.len()); println!("{:?}", result); } }
Then, in the same way, we provide a Python-callable interface for this algorithm. The code is as follows:
#[pyfunction]
pub fn top_n(bg_image: &PyString, cg_image: &PyString, ch_size: usize, top_n: usize) -> PyResult<Vec<Point>> {
    let target = decode(bg_image.to_string()).unwrap();
    let bg_image = image::load_from_memory(&target).unwrap();
    let target = decode(cg_image.to_string()).unwrap();
    let cg_image = image::load_from_memory(&target).unwrap();
    let result = HilltopParamAndResult::new(bg_image, cg_image, ch_size as u32, top_n);
    let result = x::find_top_n(result);
    return PyResult::Ok(result);
}

#[pymodule]
fn image_magic(_py: Python, m: &PyModule) -> PyResult<()> {
    m.add_function(wrap_pyfunction!(demo_py_function, m)?)?;
    m.add_function(wrap_pyfunction!(avg_b64, m)?)?;
    m.add_function(wrap_pyfunction!(top_n, m)?)?;
    m.add_class::<Point>()?; // the Point class is returned to Python, so it must be registered too
    Ok(())
}
At this point the core of the library is complete. After compiling and packaging it, you can reproduce the demo shown at the beginning of the article.
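For reference, a minimal end-to-end sketch of how the packaged library is then used from Python (the file names are hypothetical; ch_size=25 and top_n=1 mirror the values in the opening example):

import base64

import image_magic

def b64_of(path):
    with open(path, "rb") as f:
        return base64.b64encode(f.read()).decode("utf-8")

# A pre-computed background and the challenge image to locate the target in.
bg_b64 = b64_of("./masks/background.png")
cg_b64 = b64_of("./challenge.png")

points = image_magic.top_n(bg_b64, cg_b64, 25, 1)
print(points[0].get_x(), points[0].get_y())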
Summary
With Rust and PyO3 you can write Python packages in Rust. Rust can actually do far more than this: it can also produce libraries callable from C/C++/WASM/Node.js and so on. From the Python side, the experience is quite smooth and the implementation is fairly easy, so in the future, whenever there is a library I'd rather not write directly in Python, this approach is worth considering. Since the result is compiled to a binary, the source code is also somewhat better protected. All in all, this was a very pleasant experience. Finally, thanks to the authors of the code and articles listed in the references.
❝ Disclaimer: the code in this article is for learning and reference only. Its main purpose is to show how to implement a Python library in Rust; the code does not target any specific product. Please use it responsibly and do not use it for any illegal purpose. Thank you.