Slowness when selecting and base64-encoding images from the database

轮回少年 2020-11-22 11:25

I am working in the Ionic framework, currently designing a posts page with text and images. Users can post their data and images, and everything is secure.

So, I use base64 encoding…

3 Answers
  •  半阙折子戏 2020-11-22 12:02

    Since it's just personal files, you could store them in S3.

    To be safe about file uploads, just check the file's MIME type before storing it, whichever storage you choose.

    http://php.net/manual/en/function.mime-content-type.php

    just run a quick check on the uploaded file:

    // Accept the upload only when the detected MIME type is JPEG
    $mime = mime_content_type($file_path);
    if ($mime === 'image/jpeg') return true;
    return false;
    

    no big deal!

    Keeping files in the database is bad practice; it should be your last resort. S3 is great for many use cases, but it gets expensive at high usage, and local files should be used only for intranets and apps that aren't publicly available.

    In my opinion, go with S3.

    Amazon's SDK is easy to use and you get 1 GB of free storage for testing. You could also use your own server; just keep the files out of your database.
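
    As a rough sketch of that upload (not from the original answer), using the AWS SDK for PHP v3; the bucket name, region, and key below are placeholders you'd swap for your own:

    require 'vendor/autoload.php';          // Composer autoloader for the AWS SDK
    use Aws\S3\S3Client;

    // Hypothetical region/bucket/key values -- adjust for your own setup
    $s3 = new S3Client([
        'version' => 'latest',
        'region'  => 'us-east-1',
    ]);

    $s3->putObject([
        'Bucket'      => 'my-posts-bucket',
        'Key'         => 'users/123/img1.jpeg',
        'SourceFile'  => $file_path,        // the file you already MIME-checked
        'ContentType' => 'image/jpeg',
    ]);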

    Solution for storing images on the filesystem

    Let's say you have 100,000 users and each of them has 10 pics. How do you handle storing them locally? Problem: a single directory on a Linux filesystem becomes painfully slow once it holds tens of thousands of files, so you should design the file structure to avoid that.

    Solution: make the folder path be floor(userID / 1000) * 1000, followed by the userID itself.

    That way, for the user with id 989787, their images will be stored under the folder 989000/989787/: 989000/989787/img1.jpeg, 989000/989787/img2.jpeg, 989000/989787/img3.jpeg.

    And there you have it: a way of storing images for a million users without overloading any single directory on the Unix filesystem (see the sketch below).
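
    A minimal sketch of that bucketing scheme in PHP; the helper name user_image_dir() and the base directory are hypothetical, only the intdiv-by-1000 layout comes from the answer above:

    // Map a user ID to its bucketed folder, e.g. 989787 -> 989000/989787
    function user_image_dir(int $userId, string $baseDir = '/var/www/images'): string
    {
        $bucket = intdiv($userId, 1000) * 1000;   // 989787 -> 989000
        return sprintf('%s/%d/%d', $baseDir, $bucket, $userId);
    }

    // 'img1.jpeg' for user 989787 ends up at /var/www/images/989000/989787/img1.jpeg
    $dir = user_image_dir(989787);
    if (!is_dir($dir)) {
        mkdir($dir, 0755, true);                  // create the bucket folder on demand
    }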

    How about storage sizes?

    Last month I had to compress 1.3 million JPEGs for the e-commerce site I work on. When uploading images, compress them with Imagick at 80% quality and strip the metadata (a sketch follows); that removes data you can't see and optimizes your storage. Since our images vary from 40x40 (thumbnails) to 1500x1500 (zoom images), we average around 700x700 per image, and 1.3 million of them filled around 120 GB of storage.
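
    A minimal sketch of that recompression with PHP's Imagick extension; the input and output paths are placeholders, and the 80% quality figure is the one quoted above:

    // Recompress an uploaded JPEG at 80% quality and strip metadata
    $img = new Imagick('/tmp/upload.jpeg');       // placeholder input path
    $img->setImageFormat('jpeg');
    $img->setImageCompressionQuality(80);         // 80% quality, as in the answer
    $img->stripImage();                           // drop EXIF/ICC metadata
    $img->writeImage('/var/www/images/989000/989787/img1.jpeg');
    $img->destroy();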

    So yeah, it's possible to store it all on your filesystem.

    When things start to get slow, you hire a CDN.

    How will that work?

    The CDN sits in front of your image server. Whenever the CDN is asked for a file and doesn't find it in its storage (a cache miss), it fetches it from your image server. Later, when the CDN is asked for it again, it delivers the image from its own cache.

    This way almost no code is needed to migrate to CDN image delivery: all you need to do is hire a CDN and change the image URLs in your site. The same works for an S3 bucket.
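
    As an illustration of that URL swap (not part of the original answer), a hypothetical image_url() helper where only the base host ever changes; the CDN hostname is a placeholder:

    // Build image URLs from a configurable base host, so switching to a CDN
    // (or an S3 bucket) only means changing IMAGE_BASE_URL
    define('IMAGE_BASE_URL', 'https://cdn.example.com');   // placeholder CDN host

    function image_url(int $userId, string $file): string
    {
        $bucket = intdiv($userId, 1000) * 1000;   // same bucketing as the folder scheme
        return sprintf('%s/%d/%d/%s', IMAGE_BASE_URL, $bucket, $userId, $file);
    }

    // e.g. https://cdn.example.com/989000/989787/img1.jpeg
    echo image_url(989787, 'img1.jpeg');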

    It's not a cheap service, but it's waaaaay cheaper than CloudFront, and by the time you actually need it, you can probably afford it.
