Appendix E — Image processing with Convolutional Neural Networks - Pytorch

Author

phonchi

Published

May 1, 2023



E.1 Setup

!apt-get install tree -qq
!pip install torchinfo -qq
!pip install pyyaml==5.1 -qq
import distutils.core
# Note: This is a faster way to install detectron2 in Colab, but it does not include all functionalities.
# See https://detectron2.readthedocs.io/tutorials/install.html for full installation instructions
!git clone 'https://github.com/facebookresearch/detectron2'
dist = distutils.core.run_setup("./detectron2/setup.py")
!pip install {' '.join([f"'{x}'" for x in dist.install_requires])} -qq
Selecting previously unselected package tree.
(Reading database ... 122518 files and directories currently installed.)
Preparing to unpack .../tree_1.8.0-1_amd64.deb ...
Unpacking tree (1.8.0-1) ...
Setting up tree (1.8.0-1) ...
Processing triggers for man-db (2.9.1-1) ...
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 274.2/274.2 kB 5.6 MB/s eta 0:00:00
  Preparing metadata (setup.py) ... done
  Building wheel for pyyaml (setup.py) ... done
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
flax 0.6.9 requires PyYAML>=5.4.1, but you have pyyaml 5.1 which is incompatible.
dask 2022.12.1 requires pyyaml>=5.3.1, but you have pyyaml 5.1 which is incompatible.
Cloning into 'detectron2'...
remote: Enumerating objects: 15022, done.
remote: Counting objects: 100% (47/47), done.
remote: Compressing objects: 100% (34/34), done.
remote: Total 15022 (delta 23), reused 31 (delta 13), pack-reused 14975
Receiving objects: 100% (15022/15022), 6.10 MiB | 26.49 MiB/s, done.
Resolving deltas: 100% (10886/10886), done.
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 50.2/50.2 kB 2.7 MB/s eta 0:00:00
  Preparing metadata (setup.py) ... done
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 79.5/79.5 kB 9.0 MB/s eta 0:00:00
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 154.5/154.5 kB 16.2 MB/s eta 0:00:00
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.7/1.7 MB 35.1 MB/s eta 0:00:00
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 117.0/117.0 kB 16.1 MB/s eta 0:00:00
  Preparing metadata (setup.py) ... done
  Building wheel for fvcore (setup.py) ... done
  Building wheel for antlr4-python3-runtime (setup.py) ... done
!pip install git+https://github.com/cleanlab/cleanvision.git -qq
!pip install cleanlab -qq
!pip install skorch -qq
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Preparing metadata (pyproject.toml) ... done
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 296.5/296.5 kB 6.5 MB/s eta 0:00:00
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 3.4/3.4 MB 31.2 MB/s eta 0:00:00
  Building wheel for cleanvision (pyproject.toml) ... done
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 175.8/175.8 kB 2.2 MB/s eta 0:00:00
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 193.7/193.7 kB 5.0 MB/s eta 0:00:00

We need to restart the Colab runtime after installing these packages.

# Python ≥3.7 is recommended
import sys
assert sys.version_info >= (3, 7)
import os
sys.path.insert(0, os.path.abspath('./detectron2'))
import gc

# Scikit-Learn ≥1.0.1 is recommended
from packaging import version
import sklearn
from sklearn.datasets import load_sample_image
from sklearn.datasets import load_sample_images
from sklearn.datasets import fetch_openml
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import accuracy_score
assert version.parse(sklearn.__version__) >= version.parse("1.0.1")

# Pytorch related
import torch
import torch.nn as nn
import torch.optim as optim 
from torchvision import datasets, transforms
from torchvision.transforms.functional import to_pil_image
import torch.nn.functional as F
from torchinfo import summary
from fastai.vision.all import *

# Object detection
import detectron2
from detectron2.utils.logger import setup_logger
from detectron2 import model_zoo
from detectron2.engine import DefaultPredictor
from detectron2.config import get_cfg
from detectron2.utils.visualizer import Visualizer
from detectron2.data import MetadataCatalog, DatasetCatalog
from detectron2.structures import BoxMode
from google.colab.patches import cv2_imshow
from detectron2.engine import DefaultTrainer
from detectron2.utils.visualizer import ColorMode
from detectron2.evaluation import COCOEvaluator, inference_on_dataset
from detectron2.data import build_detection_test_loader

# Image augmentation
import albumentations as A
from albumentations.pytorch import ToTensorV2

# Data centric AI
from cleanvision.imagelab import Imagelab
from cleanlab.filter import find_label_issues
from skorch import NeuralNetClassifier

# Common imports
import numpy as np
import shutil
import pathlib
import resource
import tqdm
import copy
import json
import cv2
import random

# To plot pretty figures
%matplotlib inline
import matplotlib.pyplot as plt
import matplotlib as mpl
plt.rc('font', size=14)
plt.rc('axes', labelsize=14, titlesize=14)
plt.rc('legend', fontsize=14)
plt.rc('xtick', labelsize=10)
plt.rc('ytick', labelsize=10)

# to make this notebook's output stable across runs
np.random.seed(42)
torch.manual_seed(42)
<torch._C.Generator at 0x7f94d710f890>
if not torch.cuda.device_count():
    print("No GPU was detected. Neural nets can be very slow without a GPU.")
    if "google.colab" in sys.modules:
        print("Go to Runtime > Change runtime and select a GPU hardware "
              "accelerator.")
    if "kaggle_secrets" in sys.modules:
        print("Go to Settings > Accelerator and select GPU.")

A couple of utility functions to plot grayscale and RGB images:

def plot_image(image):
    plt.imshow(image, cmap="gray", interpolation="nearest")
    plt.axis("off")

def plot_color_image(image):
    plt.imshow(image, interpolation="nearest")
    plt.axis("off")

def plot_examples(id_iter, nrows=1, ncols=1):
    # Note: relies on the globals `X` and `labels`, which are defined elsewhere in the notebook
    for count, id in enumerate(id_iter):
        plt.subplot(nrows, ncols, count + 1)
        plt.imshow(X[id].reshape(28, 28), cmap="gray")
        plt.title(f"id: {id} \n label: {labels[id]}")
        plt.axis("off")

    plt.tight_layout(h_pad=2.0)

You can find examples of useful kernels at https://setosa.io/ev/image-kernels/.
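As a small illustration of what such a kernel does, the sketch below applies a 3×3 edge-detection kernel (similar to those on the page above) to a toy image with `torch.nn.functional.conv2d`. The toy image and kernel values here are my own example, not from the page:

```python
import torch
import torch.nn.functional as F

# A 3x3 edge-detection kernel: responds strongly where intensity changes.
kernel = torch.tensor([[-1., -1., -1.],
                       [-1.,  8., -1.],
                       [-1., -1., -1.]]).reshape(1, 1, 3, 3)

# A toy 8x8 "image": a bright 4x4 square on a dark background.
image = torch.zeros(1, 1, 8, 8)
image[:, :, 2:6, 2:6] = 1.0

# Convolve with padding=1 to keep the spatial size at 8x8.
response = F.conv2d(image, kernel, padding=1)
```

Inside the square the kernel's weights cancel out (8 − 8 = 0), so the response is zero in flat regions and nonzero only along the square's edges.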

E.2 Tackling Fashion MNIST With a CNN

Before delving into the code, you can go through https://poloclub.github.io/cnn-explainer/ to make sure you understand every component of a CNN.

Typical CNN architectures stack a few convolutional layers (each one generally followed by a ReLU layer), then a pooling layer, then another few convolutional layers (+ReLU), then another pooling layer, and so on. The image gets smaller and smaller as it progresses through the network, but it also typically gets deeper and deeper (i.e., with more feature maps) thanks to the convolutional layers. At the top of the stack, a regular feedforward neural network is added, composed of a few fully connected layers (+ReLUs), and the final layer outputs the prediction (e.g., a softmax layer that outputs estimated class probabilities).

[Figure: a typical CNN architecture]
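The "smaller but deeper" pattern is easy to verify by tracing a dummy batch through a short conv/pool stack. This is a minimal sketch (the layer sizes here are arbitrary, not those of the model we build below):

```python
import torch
import torch.nn as nn

x = torch.zeros(1, 1, 28, 28)             # one 28x28 grayscale image
layers = [
    nn.Conv2d(1, 32, 3, padding='same'),  # depth grows: 1 -> 32 feature maps
    nn.MaxPool2d(2),                      # spatial size halves: 28 -> 14
    nn.Conv2d(32, 64, 3, padding='same'), # depth grows: 32 -> 64
    nn.MaxPool2d(2),                      # spatial size halves: 14 -> 7
]
for layer in layers:
    x = layer(x)
    print(type(layer).__name__, tuple(x.shape))
```

Each pooling layer halves the spatial dimensions while the convolutions add feature maps, which is exactly the progression visible in the `summary()` output below.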

Here is how you can implement a simple CNN to tackle the Fashion MNIST dataset:

learn = None
model = None
gc.collect()
torch.cuda.empty_cache()
transform = transforms.ToTensor()  # ToTensor() already scales pixel values to [0, 1]

trainset = datasets.FashionMNIST(
    root="data",            
    train=True,             
    download=True,         
    transform=transform,  
)

testset = datasets.FashionMNIST(
    root="data",           
    train=False,           
    download=True,         
    transform=transform,
)

# Prepare the validation set

trainset, validset = torch.utils.data.random_split(trainset, [55000, 5000])

# Data Loader
trainloader = torch.utils.data.DataLoader(trainset, batch_size=32, shuffle=True, num_workers=2)
validloader = torch.utils.data.DataLoader(validset, batch_size=32, shuffle=False, num_workers=2)
testloader = torch.utils.data.DataLoader(testset, batch_size=32, shuffle=False, num_workers=2)

len(trainset), len(validset), len(testset)
Downloading http://fashion-mnist.s3-website.eu-central-1.amazonaws.com/train-images-idx3-ubyte.gz
Downloading http://fashion-mnist.s3-website.eu-central-1.amazonaws.com/train-images-idx3-ubyte.gz to data/FashionMNIST/raw/train-images-idx3-ubyte.gz
100%|██████████| 26421880/26421880 [00:03<00:00, 8415136.41it/s] 
Extracting data/FashionMNIST/raw/train-images-idx3-ubyte.gz to data/FashionMNIST/raw

Downloading http://fashion-mnist.s3-website.eu-central-1.amazonaws.com/train-labels-idx1-ubyte.gz
Downloading http://fashion-mnist.s3-website.eu-central-1.amazonaws.com/train-labels-idx1-ubyte.gz to data/FashionMNIST/raw/train-labels-idx1-ubyte.gz
100%|██████████| 29515/29515 [00:00<00:00, 143845.01it/s]
Extracting data/FashionMNIST/raw/train-labels-idx1-ubyte.gz to data/FashionMNIST/raw

Downloading http://fashion-mnist.s3-website.eu-central-1.amazonaws.com/t10k-images-idx3-ubyte.gz
Downloading http://fashion-mnist.s3-website.eu-central-1.amazonaws.com/t10k-images-idx3-ubyte.gz to data/FashionMNIST/raw/t10k-images-idx3-ubyte.gz
100%|██████████| 4422102/4422102 [00:01<00:00, 2692092.56it/s]
Extracting data/FashionMNIST/raw/t10k-images-idx3-ubyte.gz to data/FashionMNIST/raw

Downloading http://fashion-mnist.s3-website.eu-central-1.amazonaws.com/t10k-labels-idx1-ubyte.gz
Downloading http://fashion-mnist.s3-website.eu-central-1.amazonaws.com/t10k-labels-idx1-ubyte.gz to data/FashionMNIST/raw/t10k-labels-idx1-ubyte.gz
100%|██████████| 5148/5148 [00:00<00:00, 18957223.00it/s]
Extracting data/FashionMNIST/raw/t10k-labels-idx1-ubyte.gz to data/FashionMNIST/raw
(55000, 5000, 10000)
model = torch.nn.Sequential(
    torch.nn.Conv2d(1, 32, 7, padding='same'),
    torch.nn.ReLU(),
    torch.nn.MaxPool2d(2),
    torch.nn.Conv2d(32, 64, 3, padding='same'),
    torch.nn.ReLU(),
    torch.nn.Conv2d(64, 64, 3, padding='same'),
    torch.nn.ReLU(),
    torch.nn.MaxPool2d(2),
    torch.nn.Conv2d(64, 128, 3, padding='same'),
    torch.nn.ReLU(),
    torch.nn.Conv2d(128, 128, 3, padding='same'),
    torch.nn.ReLU(),
    torch.nn.MaxPool2d(2),
    torch.nn.Flatten(),
    torch.nn.Linear(3*3*128, 64),
    torch.nn.ReLU(),
    torch.nn.Dropout(0.5),
    torch.nn.Linear(64, 32),
    torch.nn.ReLU(),
    torch.nn.Dropout(0.5),
    torch.nn.Linear(32, 10)
)
# Use He initialization for the convolutional layers and the linear layer
for m in model.modules():
    if isinstance(m, nn.Conv2d) or isinstance(m, nn.Linear):
        nn.init.kaiming_uniform_(m.weight)
summary(model, input_size=(32, 1, 28, 28))
==========================================================================================
Layer (type:depth-idx)                   Output Shape              Param #
==========================================================================================
Sequential                               [32, 10]                  --
├─Conv2d: 1-1                            [32, 32, 28, 28]          1,600
├─ReLU: 1-2                              [32, 32, 28, 28]          --
├─MaxPool2d: 1-3                         [32, 32, 14, 14]          --
├─Conv2d: 1-4                            [32, 64, 14, 14]          18,496
├─ReLU: 1-5                              [32, 64, 14, 14]          --
├─Conv2d: 1-6                            [32, 64, 14, 14]          36,928
├─ReLU: 1-7                              [32, 64, 14, 14]          --
├─MaxPool2d: 1-8                         [32, 64, 7, 7]            --
├─Conv2d: 1-9                            [32, 128, 7, 7]           73,856
├─ReLU: 1-10                             [32, 128, 7, 7]           --
├─Conv2d: 1-11                           [32, 128, 7, 7]           147,584
├─ReLU: 1-12                             [32, 128, 7, 7]           --
├─MaxPool2d: 1-13                        [32, 128, 3, 3]           --
├─Flatten: 1-14                          [32, 1152]                --
├─Linear: 1-15                           [32, 64]                  73,792
├─ReLU: 1-16                             [32, 64]                  --
├─Dropout: 1-17                          [32, 64]                  --
├─Linear: 1-18                           [32, 32]                  2,080
├─ReLU: 1-19                             [32, 32]                  --
├─Dropout: 1-20                          [32, 32]                  --
├─Linear: 1-21                           [32, 10]                  330
==========================================================================================
Total params: 354,666
Trainable params: 354,666
Non-trainable params: 0
Total mult-adds (M): 737.42
==========================================================================================
Input size (MB): 0.10
Forward/backward pass size (MB): 16.08
Params size (MB): 1.42
Estimated Total Size (MB): 17.60
==========================================================================================
data = DataLoaders(trainloader, validloader)
learn = Learner(data, model, loss_func=F.cross_entropy, opt_func=RMSProp, metrics=[accuracy])
learn.lr_find()
SuggestedLRs(valley=0.2089296132326126)

learn.fit_one_cycle(30, 0.001)
epoch train_loss valid_loss accuracy time
0 1.910867 1.753917 0.486200 00:21
1 1.303495 1.115846 0.652000 00:21
2 0.969552 0.697844 0.740800 00:23
3 0.735478 0.566994 0.796200 00:22
4 0.651329 0.497390 0.816000 00:21
5 0.575974 0.416111 0.851800 00:21
6 0.541723 0.407052 0.855200 00:22
7 0.485088 0.381411 0.866400 00:22
8 0.476140 0.380247 0.870800 00:21
9 0.435762 0.370856 0.875000 00:21
10 0.390848 0.378833 0.880400 00:22
11 0.371974 0.356836 0.883000 00:21
12 0.395236 0.327657 0.886200 00:21
13 0.363806 0.330452 0.890800 00:21
14 0.339303 0.357805 0.895600 00:22
15 0.348606 0.336609 0.893800 00:22
16 0.340379 0.316998 0.894600 00:21
17 0.363921 0.306676 0.895000 00:21
18 0.296538 0.350968 0.893000 00:22
19 0.295195 0.309254 0.902600 00:22
20 0.293125 0.312394 0.905400 00:22
21 0.278338 0.328402 0.907000 00:21
22 0.260811 0.313633 0.909600 00:22
23 0.254771 0.315433 0.908000 00:22
24 0.263927 0.326123 0.913400 00:22
25 0.219861 0.322429 0.909200 00:21
26 0.218416 0.326497 0.913800 00:22
27 0.199811 0.323420 0.912400 00:22
28 0.192666 0.327092 0.913200 00:22
29 0.191411 0.323766 0.913000 00:22
fastai_loss, fastai_accuracy = learn.validate(dl=testloader)
fastai_accuracy
0.9077000021934509

E.3 Training a convnet from scratch on a small dataset

Having to train an image-classification model using very little data is a common situation, which you’ll likely encounter in practice if you ever do computer vision in a professional context. A “few” samples can mean anywhere from a few hundred to a few tens of thousands of images. As a practical example, we’ll focus on classifying images as dogs or cats in a dataset containing 5,000 pictures of cats and dogs (2,500 cats, 2,500 dogs). We’ll use 2,000 pictures for training, 1,000 for validation, and 2,000 for testing.

In this section, we’ll review one basic strategy to tackle this problem: training a new model from scratch using what little data you have. We’ll start by naively training a small convnet on the 2,000 training samples, without any regularization, to set a baseline for what can be achieved. This will get us to a classification accuracy of about 70%. At that point, the main issue will be overfitting. Then we’ll introduce data augmentation, a powerful technique for mitigating overfitting in computer vision. By using data augmentation, we’ll improve the model to reach an accuracy of 80–85%.

E.3.1 The relevance of deep learning for small-data problems

What qualifies as “enough samples” to train a model is relative: relative to the size and depth of the model you’re trying to train, for starters. It isn’t possible to train a convnet to solve a complex problem with just a few tens of samples, but a few hundred can potentially suffice if the model is small and well regularized and the task is simple.

Because convnets learn local, translation-invariant features, they’re highly data-efficient on perceptual problems. Training a convnet from scratch on a very small image dataset will yield reasonable results despite a relative lack of data, without the need for any custom feature engineering. You’ll see this in action in this section.
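The "translation-invariant" property comes from the fact that convolution responses shift along with the input (strictly speaking, convolution is translation-equivariant, and pooling turns this into approximate invariance). A minimal sketch with a randomly initialized conv layer, using a toy image of my own construction:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
conv = nn.Conv2d(1, 1, 3, padding=1, bias=False)  # random 3x3 filter

img = torch.zeros(1, 1, 8, 8)
img[0, 0, 2, 2] = 1.0                                  # one bright pixel
shifted = torch.roll(img, shifts=(3, 3), dims=(2, 3))  # same pixel, moved

out1 = conv(img)
out2 = conv(shifted)

# The response pattern moves with the input: shifting the input by (3, 3)
# shifts the output by (3, 3) as well.
assert torch.allclose(torch.roll(out1, shifts=(3, 3), dims=(2, 3)), out2)
```

Because the same filter is reused at every location, a feature learned in one part of the image is immediately useful everywhere else, which is what makes convnets so data-efficient.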

E.3.2 Downloading the data

The Dogs vs. Cats dataset that we will use isn’t packaged with torchvision. It was made available by Kaggle as part of a computer vision competition in late 2013, back when convnets weren’t mainstream. You can download the original dataset from www.kaggle.com/c/dogs-vs-cats/data.

But you can also use the Kaggle API. First, you need to create a Kaggle API key and download it to your local machine. Just navigate to the Kaggle website in a web browser, log in, and go to the My Account page. In your account settings, you’ll find an API section. Clicking the Create New API Token button will generate a kaggle.json key file and will download it to your machine.

[Figure: creating a Kaggle API token from the Kaggle account settings page]

# Upload the API’s key JSON file to your Colab
# session by running the following code in a notebook cell:
from google.colab import files
files.upload()

Finally, create a ~/.kaggle folder, and copy the key file to it. As a security best practice, you should also make sure that the file is only readable by the current user, yourself:

!mkdir -p ~/.kaggle
!cp kaggle.json ~/.kaggle/
!chmod 600 ~/.kaggle/kaggle.json
# You can now download the data we’re about to use:
!kaggle competitions download -c dogs-vs-cats
Downloading dogs-vs-cats.zip to /content
 99% 805M/812M [00:04<00:00, 223MB/s]
100% 812M/812M [00:04<00:00, 177MB/s]

The first time you try to download the data, you may get a “403 Forbidden” error. That’s because you need to accept the terms associated with the dataset before you download it—you’ll have to go to www.kaggle.com/c/dogs-vs-cats/rules (while logged into your Kaggle account) and click the I Understand and Accept button. You only need to do this once.

!unzip -qq dogs-vs-cats.zip
!unzip -qq train.zip

The pictures in our dataset are medium-resolution color JPEGs. Unsurprisingly, the original dogs-versus-cats Kaggle competition, all the way back in 2013, was won by entrants who used convnets: the best entries achieved up to 95% accuracy. Even though we will train our models on less than 10% of the data that was available to the competitors, we will still obtain reasonably good performance.

This dataset contains 25,000 images of dogs and cats (12,500 from each class) and is 543 MB (compressed). After downloading and uncompressing the data, we’ll create a new dataset containing three subsets: a training set with 1,000 samples of each class, a validation set with 500 samples of each class, and a test set with 1,000 samples of each class. Why do this? Because many of the image datasets you’ll encounter in your career only contain a few thousand samples, not tens of thousands. Having more data available would make the problem easier, so it’s good practice to learn with a small dataset.
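The subset-building code appears later in the notebook; as a sketch of what it might look like (the function name `make_subset` and the destination directory `cats_vs_dogs_small` are illustrative choices, not fixed by the dataset), it copies files named `cat.<i>.jpg` / `dog.<i>.jpg` by index range:

```python
import pathlib
import shutil

def make_subset(subset_name, start_index, end_index,
                src=pathlib.Path("train"),
                dst=pathlib.Path("cats_vs_dogs_small")):
    """Copy cat.<i>.jpg / dog.<i>.jpg for start_index <= i < end_index
    into dst/<subset_name>/<category>/."""
    for category in ("cat", "dog"):
        out_dir = dst / subset_name / category
        out_dir.mkdir(parents=True, exist_ok=True)
        for i in range(start_index, end_index):
            fname = f"{category}.{i}.jpg"
            shutil.copyfile(src / fname, out_dir / fname)

# Only run if the unzipped Kaggle data is actually present.
if (pathlib.Path("train") / "cat.0.jpg").exists():
    make_subset("train", 0, 1000)           # 1,000 images per class
    make_subset("validation", 1000, 1500)   # 500 images per class
    make_subset("test", 1500, 2500)         # 1,000 images per class
```

Keeping one subdirectory per class in each split makes it easy to load the data later with folder-based dataset utilities such as `torchvision.datasets.ImageFolder`.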

!tree train
(Streamed output truncated; the train/ directory contains 25,000 JPEGs named cat.<id>.jpg and dog.<id>.jpg.)
├── dog.6303.jpg
├── dog.6304.jpg
├── dog.6305.jpg
├── dog.6306.jpg
├── dog.6307.jpg
├── dog.6308.jpg
├── dog.6309.jpg
├── dog.630.jpg
├── dog.6310.jpg
├── dog.6311.jpg
├── dog.6312.jpg
├── dog.6313.jpg
├── dog.6314.jpg
├── dog.6315.jpg
├── dog.6316.jpg
├── dog.6317.jpg
├── dog.6318.jpg
├── dog.6319.jpg
├── dog.631.jpg
├── dog.6320.jpg
├── dog.6321.jpg
├── dog.6322.jpg
├── dog.6323.jpg
├── dog.6324.jpg
├── dog.6325.jpg
├── dog.6326.jpg
├── dog.6327.jpg
├── dog.6328.jpg
├── dog.6329.jpg
├── dog.632.jpg
├── dog.6330.jpg
├── dog.6331.jpg
├── dog.6332.jpg
├── dog.6333.jpg
├── dog.6334.jpg
├── dog.6335.jpg
├── dog.6336.jpg
├── dog.6337.jpg
├── dog.6338.jpg
├── dog.6339.jpg
├── dog.633.jpg
├── dog.6340.jpg
├── dog.6341.jpg
├── dog.6342.jpg
├── dog.6343.jpg
├── dog.6344.jpg
├── dog.6345.jpg
├── dog.6346.jpg
├── dog.6347.jpg
├── dog.6348.jpg
├── dog.6349.jpg
├── dog.634.jpg
├── dog.6350.jpg
├── dog.6351.jpg
├── dog.6352.jpg
├── dog.6353.jpg
├── dog.6354.jpg
├── dog.6355.jpg
├── dog.6356.jpg
├── dog.6357.jpg
├── dog.6358.jpg
├── dog.6359.jpg
├── dog.635.jpg
├── dog.6360.jpg
├── dog.6361.jpg
├── dog.6362.jpg
├── dog.6363.jpg
├── dog.6364.jpg
├── dog.6365.jpg
├── dog.6366.jpg
├── dog.6367.jpg
├── dog.6368.jpg
├── dog.6369.jpg
├── dog.636.jpg
├── dog.6370.jpg
├── dog.6371.jpg
├── dog.6372.jpg
├── dog.6373.jpg
├── dog.6374.jpg
├── dog.6375.jpg
├── dog.6376.jpg
├── dog.6377.jpg
├── dog.6378.jpg
├── dog.6379.jpg
├── dog.637.jpg
├── dog.6380.jpg
├── dog.6381.jpg
├── dog.6382.jpg
├── dog.6383.jpg
├── dog.6384.jpg
├── dog.6385.jpg
├── dog.6386.jpg
├── dog.6387.jpg
├── dog.6388.jpg
├── dog.6389.jpg
├── dog.638.jpg
├── dog.6390.jpg
├── dog.6391.jpg
├── dog.6392.jpg
├── dog.6393.jpg
├── dog.6394.jpg
├── dog.6395.jpg
├── dog.6396.jpg
├── dog.6397.jpg
├── dog.6398.jpg
├── dog.6399.jpg
├── dog.639.jpg
├── dog.63.jpg
├── dog.6400.jpg
├── dog.6401.jpg
├── dog.6402.jpg
├── dog.6403.jpg
├── dog.6404.jpg
├── dog.6405.jpg
├── dog.6406.jpg
├── dog.6407.jpg
├── dog.6408.jpg
├── dog.6409.jpg
├── dog.640.jpg
├── dog.6410.jpg
├── dog.6411.jpg
├── dog.6412.jpg
├── dog.6413.jpg
├── dog.6414.jpg
├── dog.6415.jpg
├── dog.6416.jpg
├── dog.6417.jpg
├── dog.6418.jpg
├── dog.6419.jpg
├── dog.641.jpg
├── dog.6420.jpg
├── dog.6421.jpg
├── dog.6422.jpg
├── dog.6423.jpg
├── dog.6424.jpg
├── dog.6425.jpg
├── dog.6426.jpg
├── dog.6427.jpg
├── dog.6428.jpg
├── dog.6429.jpg
├── dog.642.jpg
├── dog.6430.jpg
├── dog.6431.jpg
├── dog.6432.jpg
├── dog.6433.jpg
├── dog.6434.jpg
├── dog.6435.jpg
├── dog.6436.jpg
├── dog.6437.jpg
├── dog.6438.jpg
├── dog.6439.jpg
├── dog.643.jpg
├── dog.6440.jpg
├── dog.6441.jpg
├── dog.6442.jpg
├── dog.6443.jpg
├── dog.6444.jpg
├── dog.6445.jpg
├── dog.6446.jpg
├── dog.6447.jpg
├── dog.6448.jpg
├── dog.6449.jpg
├── dog.644.jpg
├── dog.6450.jpg
├── dog.6451.jpg
├── dog.6452.jpg
├── dog.6453.jpg
├── dog.6454.jpg
├── dog.6455.jpg
├── dog.6456.jpg
├── dog.6457.jpg
├── dog.6458.jpg
├── dog.6459.jpg
├── dog.645.jpg
├── dog.6460.jpg
├── dog.6461.jpg
├── dog.6462.jpg
├── dog.6463.jpg
├── dog.6464.jpg
├── dog.6465.jpg
├── dog.6466.jpg
├── dog.6467.jpg
├── dog.6468.jpg
├── dog.6469.jpg
├── dog.646.jpg
├── dog.6470.jpg
├── dog.6471.jpg
├── dog.6472.jpg
├── dog.6473.jpg
├── dog.6474.jpg
├── dog.6475.jpg
├── dog.6476.jpg
├── dog.6477.jpg
├── dog.6478.jpg
├── dog.6479.jpg
├── dog.647.jpg
├── dog.6480.jpg
├── dog.6481.jpg
├── dog.6482.jpg
├── dog.6483.jpg
├── dog.6484.jpg
├── dog.6485.jpg
├── dog.6486.jpg
├── dog.6487.jpg
├── dog.6488.jpg
├── dog.6489.jpg
├── dog.648.jpg
├── dog.6490.jpg
├── dog.6491.jpg
├── dog.6492.jpg
├── dog.6493.jpg
├── dog.6494.jpg
├── dog.6495.jpg
├── dog.6496.jpg
├── dog.6497.jpg
├── dog.6498.jpg
├── dog.6499.jpg
├── dog.649.jpg
├── dog.64.jpg
├── dog.6500.jpg
├── dog.6501.jpg
├── dog.6502.jpg
├── dog.6503.jpg
├── dog.6504.jpg
├── dog.6505.jpg
├── dog.6506.jpg
├── dog.6507.jpg
├── dog.6508.jpg
├── dog.6509.jpg
├── dog.650.jpg
├── dog.6510.jpg
├── dog.6511.jpg
├── dog.6512.jpg
├── dog.6513.jpg
├── dog.6514.jpg
├── dog.6515.jpg
├── dog.6516.jpg
├── dog.6517.jpg
├── dog.6518.jpg
├── dog.6519.jpg
├── dog.651.jpg
├── dog.6520.jpg
├── dog.6521.jpg
├── dog.6522.jpg
├── dog.6523.jpg
├── dog.6524.jpg
├── dog.6525.jpg
├── dog.6526.jpg
├── dog.6527.jpg
├── dog.6528.jpg
├── dog.6529.jpg
├── dog.652.jpg
├── dog.6530.jpg
├── dog.6531.jpg
├── dog.6532.jpg
├── dog.6533.jpg
├── dog.6534.jpg
├── dog.6535.jpg
├── dog.6536.jpg
├── dog.6537.jpg
├── dog.6538.jpg
├── dog.6539.jpg
├── dog.653.jpg
├── dog.6540.jpg
├── dog.6541.jpg
├── dog.6542.jpg
├── dog.6543.jpg
├── dog.6544.jpg
├── dog.6545.jpg
├── dog.6546.jpg
├── dog.6547.jpg
├── dog.6548.jpg
├── dog.6549.jpg
├── dog.654.jpg
├── dog.6550.jpg
├── dog.6551.jpg
├── dog.6552.jpg
├── dog.6553.jpg
├── dog.6554.jpg
├── dog.6555.jpg
├── dog.6556.jpg
├── dog.6557.jpg
├── dog.6558.jpg
├── dog.6559.jpg
├── dog.655.jpg
├── dog.6560.jpg
├── dog.6561.jpg
├── dog.6562.jpg
├── dog.6563.jpg
├── dog.6564.jpg
├── dog.6565.jpg
├── dog.6566.jpg
├── dog.6567.jpg
├── dog.6568.jpg
├── dog.6569.jpg
├── dog.656.jpg
├── dog.6570.jpg
├── dog.6571.jpg
├── dog.6572.jpg
├── dog.6573.jpg
├── dog.6574.jpg
├── dog.6575.jpg
├── dog.6576.jpg
├── dog.6577.jpg
├── dog.6578.jpg
├── dog.6579.jpg
├── dog.657.jpg
├── dog.6580.jpg
├── dog.6581.jpg
├── dog.6582.jpg
├── dog.6583.jpg
├── dog.6584.jpg
├── dog.6585.jpg
├── dog.6586.jpg
├── dog.6587.jpg
├── dog.6588.jpg
├── dog.6589.jpg
├── dog.658.jpg
├── dog.6590.jpg
├── dog.6591.jpg
├── dog.6592.jpg
├── dog.6593.jpg
├── dog.6594.jpg
├── dog.6595.jpg
├── dog.6596.jpg
├── dog.6597.jpg
├── dog.6598.jpg
├── dog.6599.jpg
├── dog.659.jpg
├── dog.65.jpg
├── dog.6600.jpg
├── dog.6601.jpg
├── dog.6602.jpg
├── dog.6603.jpg
├── dog.6604.jpg
├── dog.6605.jpg
├── dog.6606.jpg
├── dog.6607.jpg
├── dog.6608.jpg
├── dog.6609.jpg
├── dog.660.jpg
├── dog.6610.jpg
├── dog.6611.jpg
├── dog.6612.jpg
├── dog.6613.jpg
├── dog.6614.jpg
├── dog.6615.jpg
├── dog.6616.jpg
├── dog.6617.jpg
├── dog.6618.jpg
├── dog.6619.jpg
├── dog.661.jpg
├── dog.6620.jpg
├── dog.6621.jpg
├── dog.6622.jpg
├── dog.6623.jpg
├── dog.6624.jpg
├── dog.6625.jpg
├── dog.6626.jpg
├── dog.6627.jpg
├── dog.6628.jpg
├── dog.6629.jpg
├── dog.662.jpg
├── dog.6630.jpg
├── dog.6631.jpg
├── dog.6632.jpg
├── dog.6633.jpg
├── dog.6634.jpg
├── dog.6635.jpg
├── dog.6636.jpg
├── dog.6637.jpg
├── dog.6638.jpg
├── dog.6639.jpg
├── dog.663.jpg
├── dog.6640.jpg
├── dog.6641.jpg
├── dog.6642.jpg
├── dog.6643.jpg
├── dog.6644.jpg
├── dog.6645.jpg
├── dog.6646.jpg
├── dog.6647.jpg
├── dog.6648.jpg
├── dog.6649.jpg
├── dog.664.jpg
├── dog.6650.jpg
├── dog.6651.jpg
├── dog.6652.jpg
├── dog.6653.jpg
├── dog.6654.jpg
├── dog.6655.jpg
├── dog.6656.jpg
├── dog.6657.jpg
├── dog.6658.jpg
├── dog.6659.jpg
├── dog.665.jpg
├── dog.6660.jpg
├── dog.6661.jpg
├── dog.6662.jpg
├── dog.6663.jpg
├── dog.6664.jpg
├── dog.6665.jpg
├── dog.6666.jpg
├── dog.6667.jpg
├── dog.6668.jpg
├── dog.6669.jpg
├── dog.666.jpg
├── dog.6670.jpg
├── dog.6671.jpg
├── dog.6672.jpg
├── dog.6673.jpg
├── dog.6674.jpg
├── dog.6675.jpg
├── dog.6676.jpg
├── dog.6677.jpg
├── dog.6678.jpg
├── dog.6679.jpg
├── dog.667.jpg
├── dog.6680.jpg
├── dog.6681.jpg
├── dog.6682.jpg
├── dog.6683.jpg
├── dog.6684.jpg
├── dog.6685.jpg
├── dog.6686.jpg
├── dog.6687.jpg
├── dog.6688.jpg
├── dog.6689.jpg
├── dog.668.jpg
├── dog.6690.jpg
├── dog.6691.jpg
├── dog.6692.jpg
├── dog.6693.jpg
├── dog.6694.jpg
├── dog.6695.jpg
├── dog.6696.jpg
├── dog.6697.jpg
├── dog.6698.jpg
├── dog.6699.jpg
├── dog.669.jpg
├── dog.66.jpg
├── dog.6700.jpg
├── dog.6701.jpg
├── dog.6702.jpg
├── dog.6703.jpg
├── dog.6704.jpg
├── dog.6705.jpg
├── dog.6706.jpg
├── dog.6707.jpg
├── dog.6708.jpg
├── dog.6709.jpg
├── dog.670.jpg
├── dog.6710.jpg
├── dog.6711.jpg
├── dog.6712.jpg
├── dog.6713.jpg
├── dog.6714.jpg
├── dog.6715.jpg
├── dog.6716.jpg
├── dog.6717.jpg
├── dog.6718.jpg
├── dog.6719.jpg
├── dog.671.jpg
├── dog.6720.jpg
├── dog.6721.jpg
├── dog.6722.jpg
├── dog.6723.jpg
├── dog.6724.jpg
├── dog.6725.jpg
├── dog.6726.jpg
├── dog.6727.jpg
├── dog.6728.jpg
├── dog.6729.jpg
├── dog.672.jpg
├── dog.6730.jpg
├── dog.6731.jpg
├── dog.6732.jpg
├── dog.6733.jpg
├── dog.6734.jpg
├── dog.6735.jpg
├── dog.6736.jpg
├── dog.6737.jpg
├── dog.6738.jpg
├── dog.6739.jpg
├── dog.673.jpg
├── dog.6740.jpg
├── dog.6741.jpg
├── dog.6742.jpg
├── dog.6743.jpg
├── dog.6744.jpg
├── dog.6745.jpg
├── dog.6746.jpg
├── dog.6747.jpg
├── dog.6748.jpg
├── dog.6749.jpg
├── dog.674.jpg
├── dog.6750.jpg
├── dog.6751.jpg
├── dog.6752.jpg
├── dog.6753.jpg
├── dog.6754.jpg
├── dog.6755.jpg
├── dog.6756.jpg
├── dog.6757.jpg
├── dog.6758.jpg
├── dog.6759.jpg
├── dog.675.jpg
├── dog.6760.jpg
├── dog.6761.jpg
├── dog.6762.jpg
├── dog.6763.jpg
├── dog.6764.jpg
├── dog.6765.jpg
├── dog.6766.jpg
├── dog.6767.jpg
├── dog.6768.jpg
├── dog.6769.jpg
├── dog.676.jpg
├── dog.6770.jpg
├── dog.6771.jpg
├── dog.6772.jpg
├── dog.6773.jpg
├── dog.6774.jpg
├── dog.6775.jpg
├── dog.6776.jpg
├── dog.6777.jpg
├── dog.6778.jpg
├── dog.6779.jpg
├── dog.677.jpg
├── dog.6780.jpg
├── dog.6781.jpg
├── dog.6782.jpg
├── dog.6783.jpg
├── dog.6784.jpg
├── dog.6785.jpg
├── dog.6786.jpg
├── dog.6787.jpg
├── dog.6788.jpg
├── dog.6789.jpg
├── dog.678.jpg
├── dog.6790.jpg
├── dog.6791.jpg
├── dog.6792.jpg
├── dog.6793.jpg
├── dog.6794.jpg
├── dog.6795.jpg
├── dog.6796.jpg
├── dog.6797.jpg
├── dog.6798.jpg
├── dog.6799.jpg
├── dog.679.jpg
├── dog.67.jpg
├── dog.6800.jpg
├── dog.6801.jpg
├── dog.6802.jpg
├── dog.6803.jpg
├── dog.6804.jpg
├── dog.6805.jpg
├── dog.6806.jpg
├── dog.6807.jpg
├── dog.6808.jpg
├── dog.6809.jpg
├── dog.680.jpg
├── dog.6810.jpg
├── dog.6811.jpg
├── dog.6812.jpg
├── dog.6813.jpg
├── dog.6814.jpg
├── dog.6815.jpg
├── dog.6816.jpg
├── dog.6817.jpg
├── dog.6818.jpg
├── dog.6819.jpg
├── dog.681.jpg
├── dog.6820.jpg
├── dog.6821.jpg
├── dog.6822.jpg
├── dog.6823.jpg
├── dog.6824.jpg
├── dog.6825.jpg
├── dog.6826.jpg
├── dog.6827.jpg
├── dog.6828.jpg
├── dog.6829.jpg
├── dog.682.jpg
├── dog.6830.jpg
├── dog.6831.jpg
├── dog.6832.jpg
├── dog.6833.jpg
├── dog.6834.jpg
├── dog.6835.jpg
├── dog.6836.jpg
├── dog.6837.jpg
├── dog.6838.jpg
├── dog.6839.jpg
├── dog.683.jpg
├── dog.6840.jpg
├── dog.6841.jpg
├── dog.6842.jpg
├── dog.6843.jpg
├── dog.6844.jpg
├── dog.6845.jpg
├── dog.6846.jpg
├── dog.6847.jpg
├── dog.6848.jpg
├── dog.6849.jpg
├── dog.684.jpg
├── dog.6850.jpg
├── dog.6851.jpg
├── dog.6852.jpg
├── dog.6853.jpg
├── dog.6854.jpg
├── dog.6855.jpg
├── dog.6856.jpg
├── dog.6857.jpg
├── dog.6858.jpg
├── dog.6859.jpg
├── dog.685.jpg
├── dog.6860.jpg
├── dog.6861.jpg
├── dog.6862.jpg
├── dog.6863.jpg
├── dog.6864.jpg
├── dog.6865.jpg
├── dog.6866.jpg
├── dog.6867.jpg
├── dog.6868.jpg
├── dog.6869.jpg
├── dog.686.jpg
├── dog.6870.jpg
├── dog.6871.jpg
├── dog.6872.jpg
├── dog.6873.jpg
├── dog.6874.jpg
├── dog.6875.jpg
├── dog.6876.jpg
├── dog.6877.jpg
├── dog.6878.jpg
├── dog.6879.jpg
├── dog.687.jpg
├── dog.6880.jpg
├── dog.6881.jpg
├── dog.6882.jpg
├── dog.6883.jpg
├── dog.6884.jpg
├── dog.6885.jpg
├── dog.6886.jpg
├── dog.6887.jpg
├── dog.6888.jpg
├── dog.6889.jpg
├── dog.688.jpg
├── dog.6890.jpg
├── dog.6891.jpg
├── dog.6892.jpg
├── dog.6893.jpg
├── dog.6894.jpg
├── dog.6895.jpg
├── dog.6896.jpg
├── dog.6897.jpg
├── dog.6898.jpg
├── dog.6899.jpg
├── dog.689.jpg
├── dog.68.jpg
├── dog.6900.jpg
├── dog.6901.jpg
├── dog.6902.jpg
├── dog.6903.jpg
├── dog.6904.jpg
├── dog.6905.jpg
├── dog.6906.jpg
├── dog.6907.jpg
├── dog.6908.jpg
├── dog.6909.jpg
├── dog.690.jpg
├── dog.6910.jpg
├── dog.6911.jpg
├── dog.6912.jpg
├── dog.6913.jpg
├── dog.6914.jpg
├── dog.6915.jpg
├── dog.6916.jpg
├── dog.6917.jpg
├── dog.6918.jpg
├── dog.6919.jpg
├── dog.691.jpg
├── dog.6920.jpg
├── dog.6921.jpg
├── dog.6922.jpg
├── dog.6923.jpg
├── dog.6924.jpg
├── dog.6925.jpg
├── dog.6926.jpg
├── dog.6927.jpg
├── dog.6928.jpg
├── dog.6929.jpg
├── dog.692.jpg
├── dog.6930.jpg
├── dog.6931.jpg
├── dog.6932.jpg
├── dog.6933.jpg
├── dog.6934.jpg
├── dog.6935.jpg
├── dog.6936.jpg
├── dog.6937.jpg
├── dog.6938.jpg
├── dog.6939.jpg
├── dog.693.jpg
├── dog.6940.jpg
├── dog.6941.jpg
├── dog.6942.jpg
├── dog.6943.jpg
├── dog.6944.jpg
├── dog.6945.jpg
├── dog.6946.jpg
├── dog.6947.jpg
├── dog.6948.jpg
├── dog.6949.jpg
├── dog.694.jpg
├── dog.6950.jpg
├── dog.6951.jpg
├── dog.6952.jpg
├── dog.6953.jpg
├── dog.6954.jpg
├── dog.6955.jpg
├── dog.6956.jpg
├── dog.6957.jpg
├── dog.6958.jpg
├── dog.6959.jpg
├── dog.695.jpg
├── dog.6960.jpg
├── dog.6961.jpg
├── dog.6962.jpg
├── dog.6963.jpg
├── dog.6964.jpg
├── dog.6965.jpg
├── dog.6966.jpg
├── dog.6967.jpg
├── dog.6968.jpg
├── dog.6969.jpg
├── dog.696.jpg
├── dog.6970.jpg
├── dog.6971.jpg
├── dog.6972.jpg
├── dog.6973.jpg
├── dog.6974.jpg
├── dog.6975.jpg
├── dog.6976.jpg
├── dog.6977.jpg
├── dog.6978.jpg
├── dog.6979.jpg
├── dog.697.jpg
├── dog.6980.jpg
├── dog.6981.jpg
├── dog.6982.jpg
├── dog.6983.jpg
├── dog.6984.jpg
├── dog.6985.jpg
├── dog.6986.jpg
├── dog.6987.jpg
├── dog.6988.jpg
├── dog.6989.jpg
├── dog.698.jpg
├── dog.6990.jpg
├── dog.6991.jpg
├── dog.6992.jpg
├── dog.6993.jpg
├── dog.6994.jpg
├── dog.6995.jpg
├── dog.6996.jpg
├── dog.6997.jpg
├── dog.6998.jpg
├── dog.6999.jpg
├── dog.699.jpg
├── dog.69.jpg
├── dog.6.jpg
├── dog.7000.jpg
├── dog.7001.jpg
├── dog.7002.jpg
├── dog.7003.jpg
├── dog.7004.jpg
├── dog.7005.jpg
├── dog.7006.jpg
├── dog.7007.jpg
├── dog.7008.jpg
├── dog.7009.jpg
├── dog.700.jpg
├── dog.7010.jpg
├── dog.7011.jpg
├── dog.7012.jpg
├── dog.7013.jpg
├── dog.7014.jpg
├── dog.7015.jpg
├── dog.7016.jpg
├── dog.7017.jpg
├── dog.7018.jpg
├── dog.7019.jpg
├── dog.701.jpg
├── dog.7020.jpg
├── dog.7021.jpg
├── dog.7022.jpg
├── dog.7023.jpg
├── dog.7024.jpg
├── dog.7025.jpg
├── dog.7026.jpg
├── dog.7027.jpg
├── dog.7028.jpg
├── dog.7029.jpg
├── dog.702.jpg
├── dog.7030.jpg
├── dog.7031.jpg
├── dog.7032.jpg
├── dog.7033.jpg
├── dog.7034.jpg
├── dog.7035.jpg
├── dog.7036.jpg
├── dog.7037.jpg
├── dog.7038.jpg
├── dog.7039.jpg
├── dog.703.jpg
├── dog.7040.jpg
├── dog.7041.jpg
├── dog.7042.jpg
├── dog.7043.jpg
├── dog.7044.jpg
├── dog.7045.jpg
├── dog.7046.jpg
├── dog.7047.jpg
├── dog.7048.jpg
├── dog.7049.jpg
├── dog.704.jpg
├── dog.7050.jpg
├── dog.7051.jpg
├── dog.7052.jpg
├── dog.7053.jpg
├── dog.7054.jpg
├── dog.7055.jpg
├── dog.7056.jpg
├── dog.7057.jpg
├── dog.7058.jpg
├── dog.7059.jpg
├── dog.705.jpg
├── dog.7060.jpg
├── dog.7061.jpg
├── dog.7062.jpg
├── dog.7063.jpg
├── dog.7064.jpg
├── dog.7065.jpg
├── dog.7066.jpg
├── dog.7067.jpg
├── dog.7068.jpg
├── dog.7069.jpg
├── dog.706.jpg
├── dog.7070.jpg
├── dog.7071.jpg
├── dog.7072.jpg
├── dog.7073.jpg
├── dog.7074.jpg
├── dog.7075.jpg
├── dog.7076.jpg
├── dog.7077.jpg
├── dog.7078.jpg
├── dog.7079.jpg
├── dog.707.jpg
├── dog.7080.jpg
├── dog.7081.jpg
├── dog.7082.jpg
├── dog.7083.jpg
├── dog.7084.jpg
├── dog.7085.jpg
├── dog.7086.jpg
├── dog.7087.jpg
├── dog.7088.jpg
├── dog.7089.jpg
├── dog.708.jpg
├── dog.7090.jpg
├── dog.7091.jpg
├── dog.7092.jpg
├── dog.7093.jpg
├── dog.7094.jpg
├── dog.7095.jpg
├── dog.7096.jpg
├── dog.7097.jpg
├── dog.7098.jpg
├── dog.7099.jpg
├── dog.709.jpg
├── dog.70.jpg
├── dog.7100.jpg
├── dog.7101.jpg
├── dog.7102.jpg
├── dog.7103.jpg
├── dog.7104.jpg
├── dog.7105.jpg
├── dog.7106.jpg
├── dog.7107.jpg
├── dog.7108.jpg
├── dog.7109.jpg
├── dog.710.jpg
├── dog.7110.jpg
├── dog.7111.jpg
├── dog.7112.jpg
├── dog.7113.jpg
├── dog.7114.jpg
├── dog.7115.jpg
├── dog.7116.jpg
├── dog.7117.jpg
├── dog.7118.jpg
├── dog.7119.jpg
├── dog.711.jpg
├── dog.7120.jpg
├── dog.7121.jpg
├── dog.7122.jpg
├── dog.7123.jpg
├── dog.7124.jpg
├── dog.7125.jpg
├── dog.7126.jpg
├── dog.7127.jpg
├── dog.7128.jpg
├── dog.7129.jpg
├── dog.712.jpg
├── dog.7130.jpg
├── dog.7131.jpg
├── dog.7132.jpg
├── dog.7133.jpg
├── dog.7134.jpg
├── dog.7135.jpg
├── dog.7136.jpg
├── dog.7137.jpg
├── dog.7138.jpg
├── dog.7139.jpg
├── dog.713.jpg
├── dog.7140.jpg
├── dog.7141.jpg
├── dog.7142.jpg
├── dog.7143.jpg
├── dog.7144.jpg
├── dog.7145.jpg
├── dog.7146.jpg
├── dog.7147.jpg
├── dog.7148.jpg
├── dog.7149.jpg
├── dog.714.jpg
├── dog.7150.jpg
├── dog.7151.jpg
├── dog.7152.jpg
├── dog.7153.jpg
├── dog.7154.jpg
├── dog.7155.jpg
├── dog.7156.jpg
├── dog.7157.jpg
├── dog.7158.jpg
├── dog.7159.jpg
├── dog.715.jpg
├── dog.7160.jpg
├── dog.7161.jpg
├── dog.7162.jpg
├── dog.7163.jpg
├── dog.7164.jpg
├── dog.7165.jpg
├── dog.7166.jpg
├── dog.7167.jpg
├── dog.7168.jpg
├── dog.7169.jpg
├── dog.716.jpg
├── dog.7170.jpg
├── dog.7171.jpg
├── dog.7172.jpg
├── dog.7173.jpg
├── dog.7174.jpg
├── dog.7175.jpg
├── dog.7176.jpg
├── dog.7177.jpg
├── dog.7178.jpg
├── dog.7179.jpg
├── dog.717.jpg
├── dog.7180.jpg
├── dog.7181.jpg
├── dog.7182.jpg
├── dog.7183.jpg
├── dog.7184.jpg
├── dog.7185.jpg
├── dog.7186.jpg
├── dog.7187.jpg
├── dog.7188.jpg
├── dog.7189.jpg
├── dog.718.jpg
├── dog.7190.jpg
├── dog.7191.jpg
├── dog.7192.jpg
├── dog.7193.jpg
├── dog.7194.jpg
├── dog.7195.jpg
├── dog.7196.jpg
├── dog.7197.jpg
├── dog.7198.jpg
├── dog.7199.jpg
├── dog.719.jpg
├── dog.71.jpg
├── dog.7200.jpg
├── dog.7201.jpg
├── dog.7202.jpg
├── dog.7203.jpg
├── dog.7204.jpg
├── dog.7205.jpg
├── dog.7206.jpg
├── dog.7207.jpg
├── dog.7208.jpg
├── dog.7209.jpg
├── dog.720.jpg
├── dog.7210.jpg
├── dog.7211.jpg
├── dog.7212.jpg
├── dog.7213.jpg
├── dog.7214.jpg
├── dog.7215.jpg
├── dog.7216.jpg
├── dog.7217.jpg
├── dog.7218.jpg
├── dog.7219.jpg
├── dog.721.jpg
├── dog.7220.jpg
├── dog.7221.jpg
├── dog.7222.jpg
├── dog.7223.jpg
├── dog.7224.jpg
├── dog.7225.jpg
├── dog.7226.jpg
├── dog.7227.jpg
├── dog.7228.jpg
├── dog.7229.jpg
├── dog.722.jpg
├── dog.7230.jpg
├── dog.7231.jpg
├── dog.7232.jpg
├── dog.7233.jpg
├── dog.7234.jpg
├── dog.7235.jpg
├── dog.7236.jpg
├── dog.7237.jpg
├── dog.7238.jpg
├── dog.7239.jpg
├── dog.723.jpg
├── dog.7240.jpg
├── dog.7241.jpg
├── dog.7242.jpg
├── dog.7243.jpg
├── dog.7244.jpg
├── dog.7245.jpg
├── dog.7246.jpg
├── dog.7247.jpg
├── dog.7248.jpg
├── dog.7249.jpg
├── dog.724.jpg
├── dog.7250.jpg
├── dog.7251.jpg
├── dog.7252.jpg
├── dog.7253.jpg
├── dog.7254.jpg
├── dog.7255.jpg
├── dog.7256.jpg
├── dog.7257.jpg
├── dog.7258.jpg
├── dog.7259.jpg
├── dog.725.jpg
├── dog.7260.jpg
├── dog.7261.jpg
├── dog.7262.jpg
├── dog.7263.jpg
├── dog.7264.jpg
├── dog.7265.jpg
├── dog.7266.jpg
├── dog.7267.jpg
├── dog.7268.jpg
├── dog.7269.jpg
├── dog.726.jpg
├── dog.7270.jpg
├── dog.7271.jpg
├── dog.7272.jpg
├── dog.7273.jpg
├── dog.7274.jpg
├── dog.7275.jpg
├── dog.7276.jpg
├── dog.7277.jpg
├── dog.7278.jpg
├── dog.7279.jpg
├── dog.727.jpg
├── dog.7280.jpg
├── dog.7281.jpg
├── dog.7282.jpg
├── dog.7283.jpg
├── dog.7284.jpg
├── dog.7285.jpg
├── dog.7286.jpg
├── dog.7287.jpg
├── dog.7288.jpg
├── dog.7289.jpg
├── dog.728.jpg
├── dog.7290.jpg
├── dog.7291.jpg
├── dog.7292.jpg
├── dog.7293.jpg
├── dog.7294.jpg
├── dog.7295.jpg
├── dog.7296.jpg
├── dog.7297.jpg
├── dog.7298.jpg
├── dog.7299.jpg
├── dog.729.jpg
├── dog.72.jpg
├── dog.7300.jpg
├── dog.7301.jpg
├── dog.7302.jpg
├── dog.7303.jpg
├── dog.7304.jpg
├── dog.7305.jpg
├── dog.7306.jpg
├── dog.7307.jpg
├── dog.7308.jpg
├── dog.7309.jpg
├── dog.730.jpg
├── dog.7310.jpg
├── dog.7311.jpg
├── dog.7312.jpg
├── dog.7313.jpg
├── dog.7314.jpg
├── dog.7315.jpg
├── dog.7316.jpg
├── dog.7317.jpg
├── dog.7318.jpg
├── dog.7319.jpg
├── dog.731.jpg
├── dog.7320.jpg
├── dog.7321.jpg
├── dog.7322.jpg
├── dog.7323.jpg
├── dog.7324.jpg
├── dog.7325.jpg
├── dog.7326.jpg
├── dog.7327.jpg
├── dog.7328.jpg
├── dog.7329.jpg
├── dog.732.jpg
├── dog.7330.jpg
├── dog.7331.jpg
├── dog.7332.jpg
├── dog.7333.jpg
├── dog.7334.jpg
├── dog.7335.jpg
├── dog.7336.jpg
├── dog.7337.jpg
├── dog.7338.jpg
├── dog.7339.jpg
├── dog.733.jpg
├── dog.7340.jpg
├── dog.7341.jpg
├── dog.7342.jpg
├── dog.7343.jpg
├── dog.7344.jpg
├── dog.7345.jpg
├── dog.7346.jpg
├── dog.7347.jpg
├── dog.7348.jpg
├── dog.7349.jpg
├── dog.734.jpg
├── dog.7350.jpg
├── dog.7351.jpg
├── dog.7352.jpg
├── dog.7353.jpg
├── dog.7354.jpg
├── dog.7355.jpg
├── dog.7356.jpg
├── dog.7357.jpg
├── dog.7358.jpg
├── dog.7359.jpg
├── dog.735.jpg
├── dog.7360.jpg
├── dog.7361.jpg
├── dog.7362.jpg
├── dog.7363.jpg
├── dog.7364.jpg
├── dog.7365.jpg
├── dog.7366.jpg
├── dog.7367.jpg
├── dog.7368.jpg
├── dog.7369.jpg
├── dog.736.jpg
├── dog.7370.jpg
├── dog.7371.jpg
├── dog.7372.jpg
├── dog.7373.jpg
├── dog.7374.jpg
├── dog.7375.jpg
├── dog.7376.jpg
├── dog.7377.jpg
├── dog.7378.jpg
├── dog.7379.jpg
├── dog.737.jpg
├── dog.7380.jpg
├── dog.7381.jpg
├── dog.7382.jpg
├── dog.7383.jpg
├── dog.7384.jpg
├── dog.7385.jpg
├── dog.7386.jpg
├── dog.7387.jpg
├── dog.7388.jpg
├── dog.7389.jpg
├── dog.738.jpg
├── dog.7390.jpg
├── dog.7391.jpg
├── dog.7392.jpg
├── dog.7393.jpg
├── dog.7394.jpg
├── dog.7395.jpg
├── dog.7396.jpg
├── dog.7397.jpg
├── dog.7398.jpg
├── dog.7399.jpg
├── dog.739.jpg
├── dog.73.jpg
├── dog.7400.jpg
├── dog.7401.jpg
├── dog.7402.jpg
├── dog.7403.jpg
├── dog.7404.jpg
├── dog.7405.jpg
├── dog.7406.jpg
├── dog.7407.jpg
├── dog.7408.jpg
├── dog.7409.jpg
├── dog.740.jpg
├── dog.7410.jpg
├── dog.7411.jpg
├── dog.7412.jpg
├── dog.7413.jpg
├── dog.7414.jpg
├── dog.7415.jpg
├── dog.7416.jpg
├── dog.7417.jpg
├── dog.7418.jpg
├── dog.7419.jpg
├── dog.741.jpg
├── dog.7420.jpg
├── dog.7421.jpg
├── dog.7422.jpg
├── dog.7423.jpg
├── dog.7424.jpg
├── dog.7425.jpg
├── dog.7426.jpg
├── dog.7427.jpg
├── dog.7428.jpg
├── dog.7429.jpg
├── dog.742.jpg
├── dog.7430.jpg
├── dog.7431.jpg
├── dog.7432.jpg
├── dog.7433.jpg
├── dog.7434.jpg
├── dog.7435.jpg
├── dog.7436.jpg
├── dog.7437.jpg
├── dog.7438.jpg
├── dog.7439.jpg
├── dog.743.jpg
├── dog.7440.jpg
├── dog.7441.jpg
├── dog.7442.jpg
├── dog.7443.jpg
├── dog.7444.jpg
├── dog.7445.jpg
├── dog.7446.jpg
├── dog.7447.jpg
├── dog.7448.jpg
├── dog.7449.jpg
├── dog.744.jpg
├── dog.7450.jpg
├── dog.7451.jpg
├── dog.7452.jpg
├── dog.7453.jpg
├── dog.7454.jpg
├── dog.7455.jpg
├── dog.7456.jpg
├── dog.7457.jpg
├── dog.7458.jpg
├── dog.7459.jpg
├── dog.745.jpg
├── dog.7460.jpg
├── dog.7461.jpg
├── dog.7462.jpg
├── dog.7463.jpg
├── dog.7464.jpg
├── dog.7465.jpg
├── dog.7466.jpg
├── dog.7467.jpg
├── dog.7468.jpg
├── dog.7469.jpg
├── dog.746.jpg
├── dog.7470.jpg
├── dog.7471.jpg
├── dog.7472.jpg
├── dog.7473.jpg
├── dog.7474.jpg
├── dog.7475.jpg
├── dog.7476.jpg
├── dog.7477.jpg
├── dog.7478.jpg
├── dog.7479.jpg
├── dog.747.jpg
├── dog.7480.jpg
├── dog.7481.jpg
├── dog.7482.jpg
├── dog.7483.jpg
├── dog.7484.jpg
├── dog.7485.jpg
├── dog.7486.jpg
├── dog.7487.jpg
├── dog.7488.jpg
├── dog.7489.jpg
├── dog.748.jpg
├── dog.7490.jpg
├── dog.7491.jpg
├── dog.7492.jpg
├── dog.7493.jpg
├── dog.7494.jpg
├── dog.7495.jpg
├── dog.7496.jpg
├── dog.7497.jpg
├── dog.7498.jpg
├── dog.7499.jpg
├── dog.749.jpg
├── dog.74.jpg
├── dog.7500.jpg
├── dog.7501.jpg
├── dog.7502.jpg
├── dog.7503.jpg
├── dog.7504.jpg
├── dog.7505.jpg
├── dog.7506.jpg
├── dog.7507.jpg
├── dog.7508.jpg
├── dog.7509.jpg
├── dog.750.jpg
├── dog.7510.jpg
├── dog.7511.jpg
├── dog.7512.jpg
├── dog.7513.jpg
├── dog.7514.jpg
├── dog.7515.jpg
├── dog.7516.jpg
├── dog.7517.jpg
├── dog.7518.jpg
├── dog.7519.jpg
├── dog.751.jpg
├── dog.7520.jpg
├── dog.7521.jpg
├── dog.7522.jpg
├── dog.7523.jpg
├── dog.7524.jpg
├── dog.7525.jpg
├── dog.7526.jpg
├── dog.7527.jpg
├── dog.7528.jpg
├── dog.7529.jpg
├── dog.752.jpg
├── dog.7530.jpg
├── dog.7531.jpg
├── dog.7532.jpg
├── dog.7533.jpg
├── dog.7534.jpg
├── dog.7535.jpg
├── dog.7536.jpg
├── dog.7537.jpg
├── dog.7538.jpg
├── dog.7539.jpg
├── dog.753.jpg
├── dog.7540.jpg
├── dog.7541.jpg
├── dog.7542.jpg
├── dog.7543.jpg
├── dog.7544.jpg
├── dog.7545.jpg
├── dog.7546.jpg
├── dog.7547.jpg
├── dog.7548.jpg
├── dog.7549.jpg
├── dog.754.jpg
├── dog.7550.jpg
├── dog.7551.jpg
├── dog.7552.jpg
├── dog.7553.jpg
├── dog.7554.jpg
├── dog.7555.jpg
├── dog.7556.jpg
├── dog.7557.jpg
├── dog.7558.jpg
├── dog.7559.jpg
├── dog.755.jpg
├── dog.7560.jpg
├── dog.7561.jpg
├── dog.7562.jpg
├── dog.7563.jpg
├── dog.7564.jpg
├── dog.7565.jpg
├── dog.7566.jpg
├── dog.7567.jpg
├── dog.7568.jpg
├── dog.7569.jpg
├── dog.756.jpg
├── dog.7570.jpg
├── dog.7571.jpg
├── dog.7572.jpg
├── dog.7573.jpg
├── dog.7574.jpg
├── dog.7575.jpg
├── dog.7576.jpg
├── dog.7577.jpg
├── dog.7578.jpg
├── dog.7579.jpg
├── dog.757.jpg
├── dog.7580.jpg
├── dog.7581.jpg
├── dog.7582.jpg
├── dog.7583.jpg
├── dog.7584.jpg
├── dog.7585.jpg
├── dog.7586.jpg
├── dog.7587.jpg
├── dog.7588.jpg
├── dog.7589.jpg
├── dog.758.jpg
├── dog.7590.jpg
├── dog.7591.jpg
├── dog.7592.jpg
├── dog.7593.jpg
├── dog.7594.jpg
├── dog.7595.jpg
├── dog.7596.jpg
├── dog.7597.jpg
├── dog.7598.jpg
├── dog.7599.jpg
├── dog.759.jpg
├── dog.75.jpg
├── dog.7600.jpg
├── dog.7601.jpg
├── dog.7602.jpg
├── dog.7603.jpg
├── dog.7604.jpg
├── dog.7605.jpg
├── dog.7606.jpg
├── dog.7607.jpg
├── dog.7608.jpg
├── dog.7609.jpg
├── dog.760.jpg
├── dog.7610.jpg
├── dog.7611.jpg
├── dog.7612.jpg
├── dog.7613.jpg
├── dog.7614.jpg
├── dog.7615.jpg
├── dog.7616.jpg
├── dog.7617.jpg
├── dog.7618.jpg
├── dog.7619.jpg
├── dog.761.jpg
├── dog.7620.jpg
├── dog.7621.jpg
├── dog.7622.jpg
├── dog.7623.jpg
├── dog.7624.jpg
├── dog.7625.jpg
├── dog.7626.jpg
├── dog.7627.jpg
├── dog.7628.jpg
├── dog.7629.jpg
├── dog.762.jpg
├── dog.7630.jpg
├── dog.7631.jpg
├── dog.7632.jpg
├── dog.7633.jpg
├── dog.7634.jpg
├── dog.7635.jpg
├── dog.7636.jpg
├── dog.7637.jpg
├── dog.7638.jpg
├── dog.7639.jpg
├── dog.763.jpg
├── dog.7640.jpg
├── dog.7641.jpg
├── dog.7642.jpg
├── dog.7643.jpg
├── dog.7644.jpg
├── dog.7645.jpg
├── dog.7646.jpg
├── dog.7647.jpg
├── dog.7648.jpg
├── dog.7649.jpg
├── dog.764.jpg
├── dog.7650.jpg
├── dog.7651.jpg
├── dog.7652.jpg
├── dog.7653.jpg
├── dog.7654.jpg
├── dog.7655.jpg
├── dog.7656.jpg
├── dog.7657.jpg
├── dog.7658.jpg
├── dog.7659.jpg
├── dog.765.jpg
├── dog.7660.jpg
├── dog.7661.jpg
├── dog.7662.jpg
├── dog.7663.jpg
├── dog.7664.jpg
├── dog.7665.jpg
├── dog.7666.jpg
├── dog.7667.jpg
├── dog.7668.jpg
├── dog.7669.jpg
├── dog.766.jpg
├── dog.7670.jpg
├── dog.7671.jpg
├── dog.7672.jpg
├── dog.7673.jpg
├── dog.7674.jpg
├── dog.7675.jpg
├── dog.7676.jpg
├── dog.7677.jpg
├── dog.7678.jpg
├── dog.7679.jpg
├── dog.767.jpg
├── dog.7680.jpg
├── dog.7681.jpg
├── dog.7682.jpg
├── dog.7683.jpg
├── dog.7684.jpg
├── dog.7685.jpg
├── dog.7686.jpg
├── dog.7687.jpg
├── dog.7688.jpg
├── dog.7689.jpg
├── dog.768.jpg
├── dog.7690.jpg
├── dog.7691.jpg
├── dog.7692.jpg
├── dog.7693.jpg
├── dog.7694.jpg
├── dog.7695.jpg
├── dog.7696.jpg
├── dog.7697.jpg
├── dog.7698.jpg
├── dog.7699.jpg
├── dog.769.jpg
├── dog.76.jpg
├── dog.7700.jpg
├── dog.7701.jpg
├── dog.7702.jpg
├── dog.7703.jpg
├── dog.7704.jpg
├── dog.7705.jpg
├── dog.7706.jpg
├── dog.7707.jpg
├── dog.7708.jpg
├── dog.7709.jpg
├── dog.770.jpg
├── dog.7710.jpg
├── dog.7711.jpg
├── dog.7712.jpg
├── dog.7713.jpg
├── dog.7714.jpg
├── dog.7715.jpg
├── dog.7716.jpg
├── dog.7717.jpg
├── dog.7718.jpg
├── dog.7719.jpg
├── dog.771.jpg
├── dog.7720.jpg
├── dog.7721.jpg
├── dog.7722.jpg
├── dog.7723.jpg
├── dog.7724.jpg
├── dog.7725.jpg
├── dog.7726.jpg
├── dog.7727.jpg
├── dog.7728.jpg
├── dog.7729.jpg
├── dog.772.jpg
├── dog.7730.jpg
├── dog.7731.jpg
├── dog.7732.jpg
├── dog.7733.jpg
├── dog.7734.jpg
├── dog.7735.jpg
├── dog.7736.jpg
├── dog.7737.jpg
├── dog.7738.jpg
├── dog.7739.jpg
├── dog.773.jpg
├── dog.7740.jpg
├── dog.7741.jpg
├── dog.7742.jpg
├── dog.7743.jpg
├── dog.7744.jpg
├── dog.7745.jpg
├── dog.7746.jpg
├── dog.7747.jpg
├── dog.7748.jpg
├── dog.7749.jpg
├── dog.774.jpg
├── dog.7750.jpg
├── dog.7751.jpg
├── dog.7752.jpg
├── dog.7753.jpg
├── dog.7754.jpg
├── dog.7755.jpg
├── dog.7756.jpg
├── dog.7757.jpg
├── dog.7758.jpg
├── dog.7759.jpg
├── dog.775.jpg
├── dog.7760.jpg
├── dog.7761.jpg
├── dog.7762.jpg
├── dog.7763.jpg
├── dog.7764.jpg
├── dog.7765.jpg
├── dog.7766.jpg
├── dog.7767.jpg
├── dog.7768.jpg
├── dog.7769.jpg
├── dog.776.jpg
├── dog.7770.jpg
├── dog.7771.jpg
├── dog.7772.jpg
├── dog.7773.jpg
├── dog.7774.jpg
├── dog.7775.jpg
├── dog.7776.jpg
├── dog.7777.jpg
├── dog.7778.jpg
├── dog.7779.jpg
├── dog.777.jpg
├── dog.7780.jpg
├── dog.7781.jpg
├── dog.7782.jpg
├── dog.7783.jpg
├── dog.7784.jpg
├── dog.7785.jpg
├── dog.7786.jpg
├── dog.7787.jpg
├── dog.7788.jpg
├── dog.7789.jpg
├── dog.778.jpg
├── dog.7790.jpg
├── dog.7791.jpg
├── dog.7792.jpg
├── dog.7793.jpg
├── dog.7794.jpg
├── dog.7795.jpg
├── ... (listing truncated: dog.7796.jpg through dog.9714.jpg and similar files omitted)
├── dog.9715.jpg
├── dog.9716.jpg
├── dog.9717.jpg
├── dog.9718.jpg
├── dog.9719.jpg
├── dog.971.jpg
├── dog.9720.jpg
├── dog.9721.jpg
├── dog.9722.jpg
├── dog.9723.jpg
├── dog.9724.jpg
├── dog.9725.jpg
├── dog.9726.jpg
├── dog.9727.jpg
├── dog.9728.jpg
├── dog.9729.jpg
├── dog.972.jpg
├── dog.9730.jpg
├── dog.9731.jpg
├── dog.9732.jpg
├── dog.9733.jpg
├── dog.9734.jpg
├── dog.9735.jpg
├── dog.9736.jpg
├── dog.9737.jpg
├── dog.9738.jpg
├── dog.9739.jpg
├── dog.973.jpg
├── dog.9740.jpg
├── dog.9741.jpg
├── dog.9742.jpg
├── dog.9743.jpg
├── dog.9744.jpg
├── dog.9745.jpg
├── dog.9746.jpg
├── dog.9747.jpg
├── dog.9748.jpg
├── dog.9749.jpg
├── dog.974.jpg
├── dog.9750.jpg
├── dog.9751.jpg
├── dog.9752.jpg
├── dog.9753.jpg
├── dog.9754.jpg
├── dog.9755.jpg
├── dog.9756.jpg
├── dog.9757.jpg
├── dog.9758.jpg
├── dog.9759.jpg
├── dog.975.jpg
├── dog.9760.jpg
├── dog.9761.jpg
├── dog.9762.jpg
├── dog.9763.jpg
├── dog.9764.jpg
├── dog.9765.jpg
├── dog.9766.jpg
├── dog.9767.jpg
├── dog.9768.jpg
├── dog.9769.jpg
├── dog.976.jpg
├── dog.9770.jpg
├── dog.9771.jpg
├── dog.9772.jpg
├── dog.9773.jpg
├── dog.9774.jpg
├── dog.9775.jpg
├── dog.9776.jpg
├── dog.9777.jpg
├── dog.9778.jpg
├── dog.9779.jpg
├── dog.977.jpg
├── dog.9780.jpg
├── dog.9781.jpg
├── dog.9782.jpg
├── dog.9783.jpg
├── dog.9784.jpg
├── dog.9785.jpg
├── dog.9786.jpg
├── dog.9787.jpg
├── dog.9788.jpg
├── dog.9789.jpg
├── dog.978.jpg
├── dog.9790.jpg
├── dog.9791.jpg
├── dog.9792.jpg
├── dog.9793.jpg
├── dog.9794.jpg
├── dog.9795.jpg
├── dog.9796.jpg
├── dog.9797.jpg
├── dog.9798.jpg
├── dog.9799.jpg
├── dog.979.jpg
├── dog.97.jpg
├── dog.9800.jpg
├── dog.9801.jpg
├── dog.9802.jpg
├── dog.9803.jpg
├── dog.9804.jpg
├── dog.9805.jpg
├── dog.9806.jpg
├── dog.9807.jpg
├── dog.9808.jpg
├── dog.9809.jpg
├── dog.980.jpg
├── dog.9810.jpg
├── dog.9811.jpg
├── dog.9812.jpg
├── dog.9813.jpg
├── dog.9814.jpg
├── dog.9815.jpg
├── dog.9816.jpg
├── dog.9817.jpg
├── dog.9818.jpg
├── dog.9819.jpg
├── dog.981.jpg
├── dog.9820.jpg
├── dog.9821.jpg
├── dog.9822.jpg
├── dog.9823.jpg
├── dog.9824.jpg
├── dog.9825.jpg
├── dog.9826.jpg
├── dog.9827.jpg
├── dog.9828.jpg
├── dog.9829.jpg
├── dog.982.jpg
├── dog.9830.jpg
├── dog.9831.jpg
├── dog.9832.jpg
├── dog.9833.jpg
├── dog.9834.jpg
├── dog.9835.jpg
├── dog.9836.jpg
├── dog.9837.jpg
├── dog.9838.jpg
├── dog.9839.jpg
├── dog.983.jpg
├── dog.9840.jpg
├── dog.9841.jpg
├── dog.9842.jpg
├── dog.9843.jpg
├── dog.9844.jpg
├── dog.9845.jpg
├── dog.9846.jpg
├── dog.9847.jpg
├── dog.9848.jpg
├── dog.9849.jpg
├── dog.984.jpg
├── dog.9850.jpg
├── dog.9851.jpg
├── dog.9852.jpg
├── dog.9853.jpg
├── dog.9854.jpg
├── dog.9855.jpg
├── dog.9856.jpg
├── dog.9857.jpg
├── dog.9858.jpg
├── dog.9859.jpg
├── dog.985.jpg
├── dog.9860.jpg
├── dog.9861.jpg
├── dog.9862.jpg
├── dog.9863.jpg
├── dog.9864.jpg
├── dog.9865.jpg
├── dog.9866.jpg
├── dog.9867.jpg
├── dog.9868.jpg
├── dog.9869.jpg
├── dog.986.jpg
├── dog.9870.jpg
├── dog.9871.jpg
├── dog.9872.jpg
├── dog.9873.jpg
├── dog.9874.jpg
├── dog.9875.jpg
├── dog.9876.jpg
├── dog.9877.jpg
├── dog.9878.jpg
├── dog.9879.jpg
├── dog.987.jpg
├── dog.9880.jpg
├── dog.9881.jpg
├── dog.9882.jpg
├── dog.9883.jpg
├── dog.9884.jpg
├── dog.9885.jpg
├── dog.9886.jpg
├── dog.9887.jpg
├── dog.9888.jpg
├── dog.9889.jpg
├── dog.988.jpg
├── dog.9890.jpg
├── dog.9891.jpg
├── dog.9892.jpg
├── dog.9893.jpg
├── dog.9894.jpg
├── dog.9895.jpg
├── dog.9896.jpg
├── dog.9897.jpg
├── dog.9898.jpg
├── dog.9899.jpg
├── dog.989.jpg
├── dog.98.jpg
├── dog.9900.jpg
├── dog.9901.jpg
├── dog.9902.jpg
├── dog.9903.jpg
├── dog.9904.jpg
├── dog.9905.jpg
├── dog.9906.jpg
├── dog.9907.jpg
├── dog.9908.jpg
├── dog.9909.jpg
├── dog.990.jpg
├── dog.9910.jpg
├── dog.9911.jpg
├── dog.9912.jpg
├── dog.9913.jpg
├── dog.9914.jpg
├── dog.9915.jpg
├── dog.9916.jpg
├── dog.9917.jpg
├── dog.9918.jpg
├── dog.9919.jpg
├── dog.991.jpg
├── dog.9920.jpg
├── dog.9921.jpg
├── dog.9922.jpg
├── dog.9923.jpg
├── dog.9924.jpg
├── dog.9925.jpg
├── dog.9926.jpg
├── dog.9927.jpg
├── dog.9928.jpg
├── dog.9929.jpg
├── dog.992.jpg
├── dog.9930.jpg
├── dog.9931.jpg
├── dog.9932.jpg
├── dog.9933.jpg
├── dog.9934.jpg
├── dog.9935.jpg
├── dog.9936.jpg
├── dog.9937.jpg
├── dog.9938.jpg
├── dog.9939.jpg
├── dog.993.jpg
├── dog.9940.jpg
├── dog.9941.jpg
├── dog.9942.jpg
├── dog.9943.jpg
├── dog.9944.jpg
├── dog.9945.jpg
├── dog.9946.jpg
├── dog.9947.jpg
├── dog.9948.jpg
├── dog.9949.jpg
├── dog.994.jpg
├── dog.9950.jpg
├── dog.9951.jpg
├── dog.9952.jpg
├── dog.9953.jpg
├── dog.9954.jpg
├── dog.9955.jpg
├── dog.9956.jpg
├── dog.9957.jpg
├── dog.9958.jpg
├── dog.9959.jpg
├── dog.995.jpg
├── dog.9960.jpg
├── dog.9961.jpg
├── dog.9962.jpg
├── dog.9963.jpg
├── dog.9964.jpg
├── dog.9965.jpg
├── dog.9966.jpg
├── dog.9967.jpg
├── dog.9968.jpg
├── dog.9969.jpg
├── dog.996.jpg
├── dog.9970.jpg
├── dog.9971.jpg
├── dog.9972.jpg
├── dog.9973.jpg
├── dog.9974.jpg
├── dog.9975.jpg
├── dog.9976.jpg
├── dog.9977.jpg
├── dog.9978.jpg
├── dog.9979.jpg
├── dog.997.jpg
├── dog.9980.jpg
├── dog.9981.jpg
├── dog.9982.jpg
├── dog.9983.jpg
├── dog.9984.jpg
├── dog.9985.jpg
├── dog.9986.jpg
├── dog.9987.jpg
├── dog.9988.jpg
├── dog.9989.jpg
├── dog.998.jpg
├── dog.9990.jpg
├── dog.9991.jpg
├── dog.9992.jpg
├── dog.9993.jpg
├── dog.9994.jpg
├── dog.9995.jpg
├── dog.9996.jpg
├── dog.9997.jpg
├── dog.9998.jpg
├── dog.9999.jpg
├── dog.999.jpg
├── dog.99.jpg
└── dog.9.jpg

0 directories, 25000 files
original_dir = pathlib.Path("train")
new_base_dir = pathlib.Path("cats_vs_dogs_small")

def make_subset(subset_name, start_index, end_index):
    for category in ("cat", "dog"):
        dir = new_base_dir / subset_name / category
        os.makedirs(dir)
        fnames = [f"{category}.{i}.jpg" for i in range(start_index, end_index)]
        for fname in fnames:
            shutil.copyfile(src=original_dir / fname,
                            dst=dir / fname)

make_subset("train", start_index=0, end_index=1000)
make_subset("validation", start_index=1000, end_index=1500)
make_subset("test", start_index=1500, end_index=2500)
!tree cats_vs_dogs_small -L 2
cats_vs_dogs_small
├── test
│   ├── cat
│   └── dog
├── train
│   ├── cat
│   └── dog
└── validation
    ├── cat
    └── dog

9 directories, 0 files

We now have 2,000 training images, 1,000 validation images, and 2,000 test images. Each split contains the same number of samples from each class: this is a balanced binary-classification problem, which means classification accuracy will be an appropriate measure of success.
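We can sanity-check the directory layout created above by counting the .jpg files per class in each split. The helper below is a small stdlib-only sketch (the function name `split_counts` is our own, not part of any library):

```python
import pathlib

def split_counts(base_dir):
    """Count .jpg files per class in each split directory (train/validation/test)."""
    base = pathlib.Path(base_dir)
    return {
        split.name: {
            cls.name: len(list(cls.glob("*.jpg")))
            for cls in sorted(split.iterdir()) if cls.is_dir()
        }
        for split in sorted(base.iterdir()) if split.is_dir()
    }
```

Running `split_counts("cats_vs_dogs_small")` should report 1,000 / 500 / 1,000 images per class for the train / validation / test splits.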

E.3.3 Building the model

The convnet will be a stack of alternating Conv2d layers (with ReLU activation) and MaxPool2d layers. But because we’re dealing with bigger images and a more complex problem, we’ll make our model larger accordingly: it will have two more Conv2d and MaxPool2d stages. This both increases the capacity of the model and further reduces the size of the feature maps so they aren’t overly large when we reach the Flatten layer.

Here, because we start from inputs of size 180 pixels × 180 pixels, we end up with feature maps of size 7 × 7 just before the Flatten layer. Because we’re looking at a binary-classification problem, we’ll end the model with a single unit (a Linear layer with one output) and a sigmoid activation. This unit will encode the probability that the model is looking at one class or the other.
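We can verify the 7 × 7 figure by tracing the spatial size through the stack: each 3 × 3 convolution with no padding shrinks each spatial dimension by 2, and each 2 × 2 max pooling halves it (with flooring). A quick arithmetic check:

```python
def conv_out(size, kernel=3, padding=0, stride=1):
    """Output size of a convolution along one spatial dimension."""
    return (size + 2 * padding - kernel) // stride + 1

def pool_out(size, kernel=2, stride=2):
    """Output size of a max-pooling layer along one spatial dimension."""
    return (size - kernel) // stride + 1

size = 180
for _ in range(4):                  # conv1..conv4: Conv2d(k=3, p=0) + MaxPool2d(2)
    size = pool_out(conv_out(size))
size = conv_out(size)               # conv5 has no pooling
print(size)  # 7
```

The sequence of sizes is 180 → 178 → 89 → 87 → 43 → 41 → 20 → 18 → 9 → 7, matching the shapes reported by torchinfo below.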

learn = None
model = None
gc.collect()
torch.cuda.empty_cache()
class CustomModel(nn.Module):
    def __init__(self):
        super(CustomModel, self).__init__()
        self.conv1 = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=0),
            nn.ReLU(),
            nn.MaxPool2d(kernel_size=2, stride=2)
        )
        self.conv2 = nn.Sequential(
            nn.Conv2d(32, 64, kernel_size=3, padding=0),
            nn.ReLU(),
            nn.MaxPool2d(kernel_size=2, stride=2)
        )
        self.conv3 = nn.Sequential(
            nn.Conv2d(64, 128, kernel_size=3, padding=0),
            nn.ReLU(),
            nn.MaxPool2d(kernel_size=2, stride=2)
        )
        self.conv4 = nn.Sequential(
            nn.Conv2d(128, 256, kernel_size=3, padding=0),
            nn.ReLU(),
            nn.MaxPool2d(kernel_size=2, stride=2)
        )
        self.conv5 = nn.Sequential(
            nn.Conv2d(256, 256, kernel_size=3, padding=0),
            nn.ReLU()
        )
        self.flatten = nn.Flatten()
        self.fc = nn.Sequential(
            nn.Linear(7 * 7 * 256, 1),
            nn.Sigmoid()
        )

    def forward(self, x):
        x = self.conv1(x)
        x = self.conv2(x)
        x = self.conv3(x)
        x = self.conv4(x)
        x = self.conv5(x)
        x = self.flatten(x)
        x = self.fc(x)
        return x

model = CustomModel()
summary(model, input_size=(32, 3, 180, 180))
/usr/local/lib/python3.10/dist-packages/torchinfo/torchinfo.py:477: UserWarning: TypedStorage is deprecated. It will be removed in the future and UntypedStorage will be the only storage class. This should only matter to you if you are using storages directly.  To access UntypedStorage directly, use tensor.untyped_storage() instead of tensor.storage()
  action_fn=lambda data: sys.getsizeof(data.storage()),
/usr/local/lib/python3.10/dist-packages/torch/storage.py:665: UserWarning: TypedStorage is deprecated. It will be removed in the future and UntypedStorage will be the only storage class. This should only matter to you if you are using storages directly.  To access UntypedStorage directly, use tensor.untyped_storage() instead of tensor.storage()
  return super().__sizeof__() + self.nbytes()
==========================================================================================
Layer (type:depth-idx)                   Output Shape              Param #
==========================================================================================
CustomModel                              [32, 1]                   --
├─Sequential: 1-1                        [32, 32, 89, 89]          --
│    └─Conv2d: 2-1                       [32, 32, 178, 178]        896
│    └─ReLU: 2-2                         [32, 32, 178, 178]        --
│    └─MaxPool2d: 2-3                    [32, 32, 89, 89]          --
├─Sequential: 1-2                        [32, 64, 43, 43]          --
│    └─Conv2d: 2-4                       [32, 64, 87, 87]          18,496
│    └─ReLU: 2-5                         [32, 64, 87, 87]          --
│    └─MaxPool2d: 2-6                    [32, 64, 43, 43]          --
├─Sequential: 1-3                        [32, 128, 20, 20]         --
│    └─Conv2d: 2-7                       [32, 128, 41, 41]         73,856
│    └─ReLU: 2-8                         [32, 128, 41, 41]         --
│    └─MaxPool2d: 2-9                    [32, 128, 20, 20]         --
├─Sequential: 1-4                        [32, 256, 9, 9]           --
│    └─Conv2d: 2-10                      [32, 256, 18, 18]         295,168
│    └─ReLU: 2-11                        [32, 256, 18, 18]         --
│    └─MaxPool2d: 2-12                   [32, 256, 9, 9]           --
├─Sequential: 1-5                        [32, 256, 7, 7]           --
│    └─Conv2d: 2-13                      [32, 256, 7, 7]           590,080
│    └─ReLU: 2-14                        [32, 256, 7, 7]           --
├─Flatten: 1-6                           [32, 12544]               --
├─Sequential: 1-7                        [32, 1]                   --
│    └─Linear: 2-15                      [32, 1]                   12,545
│    └─Sigmoid: 2-16                     [32, 1]                   --
==========================================================================================
Total params: 991,041
Trainable params: 991,041
Non-trainable params: 0
Total mult-adds (G): 13.35
==========================================================================================
Input size (MB): 12.44
Forward/backward pass size (MB): 463.09
Params size (MB): 3.96
Estimated Total Size (MB): 479.50
==========================================================================================

E.3.4 Data preprocessing

As you know by now, data should be formatted into appropriately preprocessed floating-point tensors before being fed into the model. Currently, the data sits on a drive as JPEG files, so the steps for getting it into the model are roughly as follows:

  1. Read the picture files.
  2. Decode the JPEG content to RGB grids of pixels.
  3. Convert these into floating-point tensors.
  4. Resize them to a shared size (we’ll use 180 × 180).
  5. Pack them into batches (we’ll use batches of 32 images).

It may seem a bit daunting, but fortunately PyTorch has utilities to take care of these steps automatically. In particular, torchvision provides the datasets.ImageFolder class, which lets you quickly set up a data pipeline that turns image files on disk into batches of preprocessed tensors. This is what we’ll use here.

Instantiating datasets.ImageFolder() will first list the subdirectories of the given directory and assume each one contains images from one of our classes. It will then index the image files in each subdirectory and return a Dataset object. Wrapping that dataset in a DataLoader gives us an iterator that reads these files, applies the transforms (decoding them to tensors and resizing them to a shared size), shuffles them if requested, and packs them into batches.
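Conceptually, ImageFolder derives the integer labels from the sorted subdirectory names, so "cat" maps to 0 and "dog" to 1. A minimal stdlib sketch of that mapping (not the actual torchvision implementation; `infer_classes` is our own name):

```python
import pathlib

def infer_classes(directory):
    """Mimic how ImageFolder assigns labels: subdirectory names are sorted
    alphabetically and numbered 0, 1, 2, ..."""
    classes = sorted(d.name for d in pathlib.Path(directory).iterdir() if d.is_dir())
    return {cls: idx for idx, cls in enumerate(classes)}
```

The real mapping is available after construction as `train_dataset.class_to_idx`.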

image_size = (180, 180)
batch_size = 32
#new_base_dir = pathlib.Path("cats_vs_dogs_small")
transform = transforms.Compose([
    transforms.Resize(image_size),
    transforms.ToTensor(),
    transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5))
])

# Create the ImageFolder datasets
train_dataset = datasets.ImageFolder(os.path.join(new_base_dir, "train"), transform=transform)
validation_dataset = datasets.ImageFolder(os.path.join(new_base_dir, "validation"), transform=transform)
test_dataset = datasets.ImageFolder(os.path.join(new_base_dir, "test"), transform=transform)

# Create the DataLoaders
train_loader = DataLoader(train_dataset, batch_size=batch_size, shuffle=True, num_workers=1)
validation_loader = DataLoader(validation_dataset, batch_size=batch_size, shuffle=False, num_workers=1)
test_loader = DataLoader(test_dataset, batch_size=batch_size, shuffle=False, num_workers=1)

Let’s look at the output of one of these DataLoader objects: it yields batches of 180 × 180 RGB images (shape (32, 3, 180, 180)) and integer labels (shape (32,)). There are 32 samples in each batch (the batch size).

for data_batch, labels_batch in train_loader:
    print("data batch shape:", data_batch.shape)
    print("labels batch shape:", labels_batch.shape)
    break
data batch shape: torch.Size([32, 3, 180, 180])
labels batch shape: torch.Size([32])

E.3.5 Fitting the model

Let’s fit the model on our dataset. We wrap the training and validation loaders in a fastai DataLoaders object so that validation metrics are monitored on the separate validation set during training.

def custom_binary_cross_entropy(output, target):
    batch_size = output.shape[0]  # Get the current batch size
    return F.binary_cross_entropy(output, target.reshape(batch_size, 1).float())

def binary_accuracy(output, target):
    preds = (output > 0.5).float()
    # Flatten both tensors so the comparison is elementwise regardless of
    # whether the shapes are (batch, 1) or (batch,)
    return (preds.view(-1) == target.view(-1).float()).float().mean()

data = DataLoaders(train_loader, validation_loader)
learn = Learner(data, model, loss_func=custom_binary_cross_entropy, opt_func=Adam, metrics=[binary_accuracy])
learn.fit_one_cycle(30, 0.001)
epoch train_loss valid_loss binary_accuracy time
0 0.692652 0.691543 0.602000 00:12
1 0.688626 0.695021 0.500000 00:12
2 0.689949 0.685752 0.545750 00:11
3 0.686162 0.677458 0.581500 00:12
4 0.661260 0.626807 0.625250 00:12
5 0.620587 0.633627 0.641000 00:11
6 0.593634 0.615299 0.652250 00:13
7 0.571257 0.635371 0.680250 00:11
8 0.524003 0.601736 0.666000 00:11
9 0.495714 0.538874 0.718000 00:11
10 0.465912 0.536926 0.725750 00:12
11 0.395629 0.576634 0.744000 00:12
12 0.340584 0.596848 0.758250 00:11
13 0.271378 0.574540 0.753750 00:12
14 0.205846 0.678262 0.749000 00:11
15 0.149247 0.692949 0.752000 00:12
16 0.106399 0.840231 0.768250 00:12
17 0.052970 1.088746 0.776500 00:12
18 0.024501 1.264751 0.772750 00:12
19 0.011347 1.528437 0.763750 00:13
20 0.004973 1.978761 0.771500 00:12
21 0.002348 2.146611 0.775250 00:12
22 0.001133 2.257327 0.776750 00:12
23 0.000648 2.379216 0.780750 00:12
24 0.000421 2.409948 0.780750 00:11
25 0.000328 2.435015 0.776750 00:11
26 0.000278 2.454705 0.777750 00:12
27 0.000243 2.463802 0.781750 00:11
28 0.000224 2.467732 0.780750 00:13
29 0.000214 2.468883 0.780750 00:15

Let’s plot the loss of the model on the training and validation data during training:

learn.recorder.plot_loss()

These plots are characteristic of overfitting. Let’s check the test accuracy. (We evaluate the final model here; ideally, we would save a checkpoint during training and evaluate the model as it was before it started overfitting.)

fastai_loss, fastai_accuracy = learn.validate(dl=test_loader)
fastai_accuracy
0.7599999904632568

We get a test accuracy of about 75%. Because we have relatively few training samples (2,000), overfitting will be our number one concern. You already know about a number of techniques that can help mitigate overfitting, such as dropout and weight decay (L2 regularization). We’re now going to work with a new one, specific to computer vision and used almost universally when processing images with deep learning models: data augmentation.
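One of those techniques, weight decay (L2 regularization), simply adds a penalty proportional to the sum of squared weights to the loss, which discourages large weights. A minimal sketch of the penalty term (the `wd` name mirrors fastai's `wd` argument, which you could pass to `Learner` or `fit_one_cycle`):

```python
def l2_penalty(weights, wd=1e-4):
    """L2 regularization term added to the loss: wd * sum of squared weights."""
    return wd * sum(w * w for w in weights)

# e.g. for weights [3.0, 4.0] and wd=0.1, the penalty is 0.1 * (9 + 16) = 2.5
```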

E.3.6 Using data augmentation

Overfitting is caused by having too few samples to learn from, rendering you unable to train a model that can generalize to new data. Given infinite data, your model would be exposed to every possible aspect of the data distribution at hand: you would never overfit. Data augmentation takes the approach of generating more training data from existing training samples by augmenting the samples via a number of random transformations that yield believable-looking images.

The goal is that, at training time, your model will never see the exact same picture twice. This helps expose the model to more aspects of the data so it can generalize better. In PyTorch, this can be done by adding random transforms to the data pipeline with torchvision.transforms. Let’s get started with an example: the following transforms.Compose pipeline chains several random image transformations.

learn = None
model = None
gc.collect()
torch.cuda.empty_cache()
data_augmentation = transforms.Compose([
    transforms.Resize(image_size),
    transforms.RandomHorizontalFlip(),
    transforms.RandomRotation(36),
    transforms.RandomAffine(degrees=0, translate=None, scale=(0.8, 1.2), shear=None),
    #transforms.ToTensor(),
    transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5))
])
plt.figure(figsize=(10, 10))

for images, _ in train_loader:
    augmented_images = data_augmentation(images)
    for i in range(9):
        ax = plt.subplot(3, 3, i + 1)
        image_to_plot = augmented_images[i].numpy().transpose((1, 2, 0))
        image_to_plot = (image_to_plot - image_to_plot.min())/(image_to_plot.max()-image_to_plot.min()) # Normalize to [0..1] range
        plt.imshow(image_to_plot)
        plt.axis("off")
    break  # Sample only 1 batch from the dataset
plt.show()

Alternatively, we can use the albumentations library for data augmentation:

image_size = (180, 180)
batch_size = 32

class CustomImageFolder(datasets.ImageFolder):
    def __getitem__(self, index):
        path, target = self.samples[index]
        sample = self.loader(path)
        sample = np.array(sample)

        if self.transform is not None:
            transformed = self.transform(image=sample)
            sample = transformed["image"]

        if self.target_transform is not None:
            target = self.target_transform(target)

        return sample, target

train_transform = A.Compose(
    [
        A.Resize(image_size[0], image_size[1]),
        A.ShiftScaleRotate(shift_limit=0.05, scale_limit=0.2, rotate_limit=36, p=0.5),
        A.HorizontalFlip(p=0.5),
        A.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5)),
        ToTensorV2(),
    ]
)

val_transform = A.Compose(
    [
        A.Resize(image_size[0], image_size[1]),
        A.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5)),
        ToTensorV2(),
    ]
)

# Create the ImageFolder datasets
train_dataset = CustomImageFolder(os.path.join(new_base_dir, "train"), transform=train_transform)
validation_dataset = CustomImageFolder(os.path.join(new_base_dir, "validation"), transform=val_transform)
test_dataset = CustomImageFolder(os.path.join(new_base_dir, "test"), transform=val_transform)

# Create the DataLoaders
train_loader = DataLoader(train_dataset, batch_size=batch_size, shuffle=True, num_workers=2)
validation_loader = DataLoader(validation_dataset, batch_size=batch_size, shuffle=False, num_workers=2)
test_loader = DataLoader(test_dataset, batch_size=batch_size, shuffle=False, num_workers=2)
def visualize_augmentations(dataset, idx=0, samples=10, cols=5):
    dataset = copy.deepcopy(dataset)
    dataset.transform = A.Compose([t for t in dataset.transform if not isinstance(t, (A.Normalize, ToTensorV2))])
    rows = samples // cols
    figure, ax = plt.subplots(nrows=rows, ncols=cols, figsize=(12, 6))
    for i in range(samples):
        image, _ = dataset[idx]
        ax.ravel()[i].imshow(image)
        ax.ravel()[i].set_axis_off()
    plt.tight_layout()
    plt.show()
visualize_augmentations(train_dataset)

If we train a new model using this data-augmentation configuration, the model will never see the same input twice. But the inputs it sees are still heavily intercorrelated because they come from a small number of original images—we can’t produce new information; we can only remix existing information. As such, this may not be enough to completely get rid of overfitting. To further fight overfitting, we’ll also add a Dropout layer to our model right before the densely connected classifier.

One last thing you should know: just like Dropout, data augmentation is inactive at inference time. We apply the random transforms only to the training DataLoader, and Dropout is automatically disabled when the model is put in eval() mode (as fastai does during validation). During evaluation, our model will behave just as if it did not include data augmentation and dropout.
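To make the training/inference distinction concrete, here is a framework-free sketch of inverted dropout, which is what nn.Dropout does conceptually (not the actual PyTorch implementation; the function name is our own):

```python
import random

def inverted_dropout(activations, p=0.5, training=True, rng=None):
    """Sketch of inverted dropout: during training, zero each activation with
    probability p and scale survivors by 1/(1-p) so the expected activation is
    unchanged; at inference, pass values through untouched."""
    if not training or p == 0.0:
        return list(activations)
    rng = rng or random.Random(0)
    return [0.0 if rng.random() < p else a / (1 - p) for a in activations]
```

Note that in inference mode the input comes back unchanged, which is exactly why evaluation results are unaffected by the regularization.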

class CustomModel(nn.Module):
    def __init__(self):
        super(CustomModel, self).__init__()
        self.conv1 = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=0),
            nn.ReLU(),
            nn.MaxPool2d(kernel_size=2, stride=2)
        )
        self.conv2 = nn.Sequential(
            nn.Conv2d(32, 64, kernel_size=3, padding=0),
            nn.ReLU(),
            nn.MaxPool2d(kernel_size=2, stride=2)
        )
        self.conv3 = nn.Sequential(
            nn.Conv2d(64, 128, kernel_size=3, padding=0),
            nn.ReLU(),
            nn.MaxPool2d(kernel_size=2, stride=2)
        )
        self.conv4 = nn.Sequential(
            nn.Conv2d(128, 256, kernel_size=3, padding=0),
            nn.ReLU(),
            nn.MaxPool2d(kernel_size=2, stride=2)
        )
        self.conv5 = nn.Sequential(
            nn.Conv2d(256, 256, kernel_size=3, padding=0),
            nn.ReLU()
        )
        self.flatten = nn.Flatten()
        self.fc = nn.Sequential(
            nn.Dropout(0.5),  # dropout before the classifier, as discussed above
            nn.Linear(7 * 7 * 256, 1),
            nn.Sigmoid()
        )

    def forward(self, x):
        x = self.conv1(x)
        x = self.conv2(x)
        x = self.conv3(x)
        x = self.conv4(x)
        x = self.conv5(x)
        x = self.flatten(x)
        x = self.fc(x)
        return x

model = CustomModel()
def custom_binary_cross_entropy(output, target):
    batch_size = output.shape[0]  # Get the current batch size
    return F.binary_cross_entropy(output, target.reshape(batch_size, 1).float())

def binary_accuracy(output, target):
    preds = (output > 0.5).float()
    # Flatten both tensors so the comparison is elementwise regardless of
    # whether the shapes are (batch, 1) or (batch,)
    return (preds.view(-1) == target.view(-1).float()).float().mean()

data = DataLoaders(train_loader, validation_loader)
learn = Learner(data, model, loss_func=custom_binary_cross_entropy, opt_func=Adam, metrics=[binary_accuracy])

The training loop is slow here; you could try writing a plain PyTorch training loop instead. Refer to https://albumentations.ai/docs/examples/pytorch_classification/ for more information.

learn.fit_one_cycle(60, 0.001)
epoch train_loss valid_loss binary_accuracy time
0 0.693066 0.691918 0.500000 00:38
1 0.690652 0.686164 0.600750 00:39
2 0.678950 0.651391 0.617250 00:38
3 0.656565 0.629399 0.637000 00:38
4 0.642592 0.623475 0.666500 00:38
5 0.616005 0.580928 0.689500 00:38
6 0.589917 0.570550 0.698000 00:38
7 0.573144 0.546748 0.719500 00:37
8 0.556366 0.601920 0.684500 00:37
9 0.541922 0.553274 0.697250 00:37
10 0.523928 0.503581 0.752000 00:38
11 0.516340 0.497255 0.762500 00:38
12 0.486024 0.571045 0.719500 00:37
13 0.470749 0.485204 0.747000 00:37
14 0.441717 0.454703 0.780000 00:37
15 0.411646 0.498344 0.762500 00:37
16 0.402160 0.495557 0.765000 00:37
17 0.396506 0.473726 0.783250 00:37
18 0.381660 0.490383 0.787750 00:38
19 0.360372 0.457263 0.787500 00:37
20 0.342842 0.465778 0.780750 00:37
21 0.328554 0.484067 0.793500 00:37
22 0.314617 0.468121 0.786250 00:37
23 0.301147 0.415992 0.799500 00:38
24 0.278657 0.434241 0.811750 00:37
25 0.247953 0.449205 0.810000 00:37
26 0.235257 0.427547 0.819500 00:37
27 0.233322 0.455889 0.808000 00:37
28 0.208548 0.445234 0.805500 00:38
29 0.201965 0.496131 0.821750 00:38
30 0.187354 0.470819 0.830750 00:39
31 0.168621 0.463303 0.827000 00:39
32 0.159545 0.659526 0.794000 00:39
33 0.162341 0.525368 0.817750 00:39
34 0.149857 0.459367 0.832000 00:39
35 0.141797 0.449607 0.841250 00:38
36 0.137166 0.500571 0.822750 00:39
37 0.124995 0.418119 0.842000 00:41
38 0.111028 0.487745 0.842250 00:39
39 0.096113 0.461577 0.846000 00:37
40 0.076184 0.517702 0.849500 00:37
41 0.083818 0.532772 0.843250 00:38
42 0.070036 0.582350 0.837250 00:37
43 0.070664 0.603945 0.841250 00:37
44 0.084810 0.591764 0.850250 00:37
45 0.078341 0.488943 0.852000 00:38
46 0.065318 0.624858 0.854250 00:38
47 0.065456 0.477966 0.855750 00:37
48 0.061500 0.486319 0.858500 00:37
49 0.054900 0.489795 0.855750 00:37
50 0.057763 0.487532 0.853750 00:38
51 0.045038 0.497306 0.853750 00:38
52 0.038785 0.498204 0.864750 00:37
53 0.043548 0.488389 0.858750 00:37
54 0.038572 0.493007 0.859750 00:37
55 0.037272 0.501999 0.857500 00:38
56 0.049886 0.497700 0.859750 00:38
57 0.039522 0.496305 0.859750 00:37
58 0.042081 0.498379 0.858750 00:37
59 0.038153 0.498430 0.859750 00:37

Let’s train the model using data augmentation and dropout. Because we expect overfitting to occur much later during training, we train for twice as many epochs: 60.

Let’s plot the results again. Thanks to data augmentation and dropout, we start overfitting much later: the validation loss bottoms out around epochs 25–40 (compared to around epoch 10 for the original model). The validation accuracy ends up consistently in the 80–86% range, a big improvement over our first try.

learn.recorder.plot_loss()

Let’s check the test accuracy.

fastai_loss, fastai_accuracy = learn.validate(dl=test_loader)
fastai_accuracy
0.8554999828338623

We get a test accuracy over 85%. It’s starting to look good! By further tuning the model’s configuration (such as the number of filters per convolution layer, or the number of layers in the model), we might be able to get an even better accuracy, likely up to 90%. But it would prove difficult to go any higher just by training our own convnet from scratch, because we have so little data to work with. As a next step to improve our accuracy on this problem, we’ll have to use a pretrained model, as we will see later on.

E.4 Object detection and segmentation with detectron2 (Optional)

In this section, we show how to train an existing detectron2 model on a custom dataset in a new format.

We use the balloon segmentation dataset, which has only one class: balloon. We’ll train a balloon segmentation model starting from a model pre-trained on the COCO dataset, available in detectron2’s model zoo.

Note that the COCO dataset does not have a “balloon” category. We’ll be able to recognize this new class after just a few minutes of training.

setup_logger()
<Logger detectron2 (DEBUG)>

E.4.1 Prepare the dataset

# download, decompress the data
!wget https://github.com/matterport/Mask_RCNN/releases/download/v2.1/balloon_dataset.zip
!unzip balloon_dataset.zip > /dev/null
--2023-04-30 06:53:44--  https://github.com/matterport/Mask_RCNN/releases/download/v2.1/balloon_dataset.zip
Resolving github.com (github.com)... 140.82.113.3
Connecting to github.com (github.com)|140.82.113.3|:443... connected.
HTTP request sent, awaiting response... 302 Found
Location: https://objects.githubusercontent.com/github-production-release-asset-2e65be/107595270/737339e2-2b83-11e8-856a-188034eb3468?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIAIWNJYAX4CSVEH53A%2F20230430%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20230430T065344Z&X-Amz-Expires=300&X-Amz-Signature=ea904acad73f8792f74d922b5b08311500fd5d72a41e8c6215e74e94069a3edf&X-Amz-SignedHeaders=host&actor_id=0&key_id=0&repo_id=107595270&response-content-disposition=attachment%3B%20filename%3Dballoon_dataset.zip&response-content-type=application%2Foctet-stream [following]
--2023-04-30 06:53:44--  https://objects.githubusercontent.com/github-production-release-asset-2e65be/107595270/737339e2-2b83-11e8-856a-188034eb3468?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIAIWNJYAX4CSVEH53A%2F20230430%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20230430T065344Z&X-Amz-Expires=300&X-Amz-Signature=ea904acad73f8792f74d922b5b08311500fd5d72a41e8c6215e74e94069a3edf&X-Amz-SignedHeaders=host&actor_id=0&key_id=0&repo_id=107595270&response-content-disposition=attachment%3B%20filename%3Dballoon_dataset.zip&response-content-type=application%2Foctet-stream
Resolving objects.githubusercontent.com (objects.githubusercontent.com)... 185.199.108.133, 185.199.111.133, 185.199.109.133, ...
Connecting to objects.githubusercontent.com (objects.githubusercontent.com)|185.199.108.133|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 38741381 (37M) [application/octet-stream]
Saving to: ‘balloon_dataset.zip’

balloon_dataset.zip 100%[===================>]  36.95M   145MB/s    in 0.3s    

2023-04-30 06:53:44 (145 MB/s) - ‘balloon_dataset.zip’ saved [38741381/38741381]

Register the balloon dataset with detectron2, following the detectron2 custom dataset tutorial. Here the dataset is in its own custom format, so we write a function to parse it into detectron2’s standard format. You should write such a function whenever you use a dataset in a custom format. See the tutorial for more details.

# if your dataset is in COCO format, this cell can be replaced by the following three lines:
# from detectron2.data.datasets import register_coco_instances
# register_coco_instances("my_dataset_train", {}, "json_annotation_train.json", "path/to/image/dir")
# register_coco_instances("my_dataset_val", {}, "json_annotation_val.json", "path/to/image/dir")

def get_balloon_dicts(img_dir):
    json_file = os.path.join(img_dir, "via_region_data.json")
    with open(json_file) as f:
        imgs_anns = json.load(f)

    dataset_dicts = []
    for idx, v in enumerate(imgs_anns.values()):
        record = {}
        
        filename = os.path.join(img_dir, v["filename"])
        height, width = cv2.imread(filename).shape[:2]
        
        record["file_name"] = filename
        record["image_id"] = idx
        record["height"] = height
        record["width"] = width
      
        annos = v["regions"]
        objs = []
        for _, anno in annos.items():
            assert not anno["region_attributes"]
            anno = anno["shape_attributes"]
            px = anno["all_points_x"]
            py = anno["all_points_y"]
            poly = [(x + 0.5, y + 0.5) for x, y in zip(px, py)]
            poly = [p for x in poly for p in x]

            obj = {
                "bbox": [np.min(px), np.min(py), np.max(px), np.max(py)],
                "bbox_mode": BoxMode.XYXY_ABS,
                "segmentation": [poly],
                "category_id": 0,
            }
            objs.append(obj)
        record["annotations"] = objs
        dataset_dicts.append(record)
    return dataset_dicts

for d in ["train", "val"]:
    DatasetCatalog.register("balloon_" + d, lambda d=d: get_balloon_dicts("balloon/" + d))
    MetadataCatalog.get("balloon_" + d).set(thing_classes=["balloon"])
balloon_metadata = MetadataCatalog.get("balloon_train")
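Each record produced by `get_balloon_dicts` follows detectron2’s standard dataset-dict format. The per-region geometry can be sketched in isolation (toy coordinates, no detectron2 required):

```python
import numpy as np

# A hypothetical polygon for one region (not from the real dataset):
# the VIA format stores the x and y coordinates as separate lists.
px = [10, 40, 40, 10]
py = [20, 20, 60, 60]

# Shift to pixel centers, then flatten into [x0, y0, x1, y1, ...],
# the layout detectron2 expects for polygon segmentations.
poly = [(x + 0.5, y + 0.5) for x, y in zip(px, py)]
poly = [p for xy in poly for p in xy]

# The axis-aligned bounding box in XYXY_ABS mode.
bbox = [np.min(px), np.min(py), np.max(px), np.max(py)]

print(poly)   # [10.5, 20.5, 40.5, 20.5, 40.5, 60.5, 10.5, 60.5]
print(bbox)   # [10, 20, 40, 60]
```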

To verify that the dataset is in the correct format, let’s visualize the annotations of a few randomly selected samples from the training set:

dataset_dicts = get_balloon_dicts("balloon/train")
for d in random.sample(dataset_dicts, 3):
    img = cv2.imread(d["file_name"])
    visualizer = Visualizer(img[:, :, ::-1], metadata=balloon_metadata, scale=0.5)
    out = visualizer.draw_dataset_dict(d)
    cv2_imshow(out.get_image()[:, :, ::-1])
Output hidden; open in https://colab.research.google.com to view.
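Note the `[:, :, ::-1]` slices above: OpenCV loads images in BGR channel order, while `Visualizer` works in RGB, so the channel axis is reversed on the way in and again on the way out. A minimal illustration with a dummy array:

```python
import numpy as np

# A single "pixel" with distinct B, G, R values, shaped like a cv2 image (H, W, C).
bgr = np.array([[[255, 0, 64]]], dtype=np.uint8)

# Reversing the last axis swaps the B and R channels, yielding RGB.
rgb = bgr[:, :, ::-1]
print(rgb[0, 0].tolist())   # [64, 0, 255]

# Reversing again restores the original BGR order.
assert np.array_equal(rgb[:, :, ::-1], bgr)
```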

E.4.2 Train the model

Now, let’s fine-tune a COCO-pretrained R50-FPN Mask R-CNN model on the balloon dataset.

cfg = get_cfg()
cfg.merge_from_file(model_zoo.get_config_file("COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml"))
cfg.DATASETS.TRAIN = ("balloon_train",)
cfg.DATASETS.TEST = ()
cfg.DATALOADER.NUM_WORKERS = 2
cfg.MODEL.WEIGHTS = model_zoo.get_checkpoint_url("COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml")  # Let training initialize from model zoo
cfg.SOLVER.IMS_PER_BATCH = 2  # This is the real "batch size" commonly known to deep learning people
cfg.SOLVER.BASE_LR = 0.00025  # pick a good LR
cfg.SOLVER.MAX_ITER = 300    # 300 iterations seems good enough for this toy dataset; you will need to train longer for a practical dataset
cfg.SOLVER.STEPS = []        # do not decay learning rate
cfg.MODEL.ROI_HEADS.BATCH_SIZE_PER_IMAGE = 128   # The "RoIHead batch size". 128 is faster, and good enough for this toy dataset (default: 512)
cfg.MODEL.ROI_HEADS.NUM_CLASSES = 1  # only has one class (balloon). (see https://detectron2.readthedocs.io/tutorials/datasets.html#update-the-config-for-new-datasets)
# NOTE: this config means the number of classes, but a few popular unofficial tutorials incorrectly use num_classes+1 here.

os.makedirs(cfg.OUTPUT_DIR, exist_ok=True)
trainer = DefaultTrainer(cfg) 
trainer.resume_or_load(resume=False)
trainer.train()
[04/30 06:54:01 d2.engine.defaults]: Model:
GeneralizedRCNN(
  (backbone): FPN(
    (fpn_lateral2): Conv2d(256, 256, kernel_size=(1, 1), stride=(1, 1))
    (fpn_output2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    (fpn_lateral3): Conv2d(512, 256, kernel_size=(1, 1), stride=(1, 1))
    (fpn_output3): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    (fpn_lateral4): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1))
    (fpn_output4): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    (fpn_lateral5): Conv2d(2048, 256, kernel_size=(1, 1), stride=(1, 1))
    (fpn_output5): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
    (top_block): LastLevelMaxPool()
    (bottom_up): ResNet(
      (stem): BasicStem(
        (conv1): Conv2d(
          3, 64, kernel_size=(7, 7), stride=(2, 2), padding=(3, 3), bias=False
          (norm): FrozenBatchNorm2d(num_features=64, eps=1e-05)
        )
      )
      (res2): Sequential(
        (0): BottleneckBlock(
          (shortcut): Conv2d(
            64, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
          )
          (conv1): Conv2d(
            64, 64, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=64, eps=1e-05)
          )
          (conv2): Conv2d(
            64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=64, eps=1e-05)
          )
          (conv3): Conv2d(
            64, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
          )
        )
        (1): BottleneckBlock(
          (conv1): Conv2d(
            256, 64, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=64, eps=1e-05)
          )
          (conv2): Conv2d(
            64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=64, eps=1e-05)
          )
          (conv3): Conv2d(
            64, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
          )
        )
        (2): BottleneckBlock(
          (conv1): Conv2d(
            256, 64, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=64, eps=1e-05)
          )
          (conv2): Conv2d(
            64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=64, eps=1e-05)
          )
          (conv3): Conv2d(
            64, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
          )
        )
      )
      (res3): Sequential(
        (0): BottleneckBlock(
          (shortcut): Conv2d(
            256, 512, kernel_size=(1, 1), stride=(2, 2), bias=False
            (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
          )
          (conv1): Conv2d(
            256, 128, kernel_size=(1, 1), stride=(2, 2), bias=False
            (norm): FrozenBatchNorm2d(num_features=128, eps=1e-05)
          )
          (conv2): Conv2d(
            128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=128, eps=1e-05)
          )
          (conv3): Conv2d(
            128, 512, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
          )
        )
        (1): BottleneckBlock(
          (conv1): Conv2d(
            512, 128, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=128, eps=1e-05)
          )
          (conv2): Conv2d(
            128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=128, eps=1e-05)
          )
          (conv3): Conv2d(
            128, 512, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
          )
        )
        (2): BottleneckBlock(
          (conv1): Conv2d(
            512, 128, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=128, eps=1e-05)
          )
          (conv2): Conv2d(
            128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=128, eps=1e-05)
          )
          (conv3): Conv2d(
            128, 512, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
          )
        )
        (3): BottleneckBlock(
          (conv1): Conv2d(
            512, 128, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=128, eps=1e-05)
          )
          (conv2): Conv2d(
            128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=128, eps=1e-05)
          )
          (conv3): Conv2d(
            128, 512, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
          )
        )
      )
      (res4): Sequential(
        (0): BottleneckBlock(
          (shortcut): Conv2d(
            512, 1024, kernel_size=(1, 1), stride=(2, 2), bias=False
            (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
          )
          (conv1): Conv2d(
            512, 256, kernel_size=(1, 1), stride=(2, 2), bias=False
            (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
          )
          (conv2): Conv2d(
            256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
          )
          (conv3): Conv2d(
            256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
          )
        )
        (1): BottleneckBlock(
          (conv1): Conv2d(
            1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
          )
          (conv2): Conv2d(
            256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
          )
          (conv3): Conv2d(
            256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
          )
        )
        (2): BottleneckBlock(
          (conv1): Conv2d(
            1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
          )
          (conv2): Conv2d(
            256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
          )
          (conv3): Conv2d(
            256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
          )
        )
        (3): BottleneckBlock(
          (conv1): Conv2d(
            1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
          )
          (conv2): Conv2d(
            256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
          )
          (conv3): Conv2d(
            256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
          )
        )
        (4): BottleneckBlock(
          (conv1): Conv2d(
            1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
          )
          (conv2): Conv2d(
            256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
          )
          (conv3): Conv2d(
            256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
          )
        )
        (5): BottleneckBlock(
          (conv1): Conv2d(
            1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
          )
          (conv2): Conv2d(
            256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
          )
          (conv3): Conv2d(
            256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
          )
        )
      )
      (res5): Sequential(
        (0): BottleneckBlock(
          (shortcut): Conv2d(
            1024, 2048, kernel_size=(1, 1), stride=(2, 2), bias=False
            (norm): FrozenBatchNorm2d(num_features=2048, eps=1e-05)
          )
          (conv1): Conv2d(
            1024, 512, kernel_size=(1, 1), stride=(2, 2), bias=False
            (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
          )
          (conv2): Conv2d(
            512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
          )
          (conv3): Conv2d(
            512, 2048, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=2048, eps=1e-05)
          )
        )
        (1): BottleneckBlock(
          (conv1): Conv2d(
            2048, 512, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
          )
          (conv2): Conv2d(
            512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
          )
          (conv3): Conv2d(
            512, 2048, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=2048, eps=1e-05)
          )
        )
        (2): BottleneckBlock(
          (conv1): Conv2d(
            2048, 512, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
          )
          (conv2): Conv2d(
            512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
          )
          (conv3): Conv2d(
            512, 2048, kernel_size=(1, 1), stride=(1, 1), bias=False
            (norm): FrozenBatchNorm2d(num_features=2048, eps=1e-05)
          )
        )
      )
    )
  )
  (proposal_generator): RPN(
    (rpn_head): StandardRPNHead(
      (conv): Conv2d(
        256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1)
        (activation): ReLU()
      )
      (objectness_logits): Conv2d(256, 3, kernel_size=(1, 1), stride=(1, 1))
      (anchor_deltas): Conv2d(256, 12, kernel_size=(1, 1), stride=(1, 1))
    )
    (anchor_generator): DefaultAnchorGenerator(
      (cell_anchors): BufferList()
    )
  )
  (roi_heads): StandardROIHeads(
    (box_pooler): ROIPooler(
      (level_poolers): ModuleList(
        (0): ROIAlign(output_size=(7, 7), spatial_scale=0.25, sampling_ratio=0, aligned=True)
        (1): ROIAlign(output_size=(7, 7), spatial_scale=0.125, sampling_ratio=0, aligned=True)
        (2): ROIAlign(output_size=(7, 7), spatial_scale=0.0625, sampling_ratio=0, aligned=True)
        (3): ROIAlign(output_size=(7, 7), spatial_scale=0.03125, sampling_ratio=0, aligned=True)
      )
    )
    (box_head): FastRCNNConvFCHead(
      (flatten): Flatten(start_dim=1, end_dim=-1)
      (fc1): Linear(in_features=12544, out_features=1024, bias=True)
      (fc_relu1): ReLU()
      (fc2): Linear(in_features=1024, out_features=1024, bias=True)
      (fc_relu2): ReLU()
    )
    (box_predictor): FastRCNNOutputLayers(
      (cls_score): Linear(in_features=1024, out_features=2, bias=True)
      (bbox_pred): Linear(in_features=1024, out_features=4, bias=True)
    )
    (mask_pooler): ROIPooler(
      (level_poolers): ModuleList(
        (0): ROIAlign(output_size=(14, 14), spatial_scale=0.25, sampling_ratio=0, aligned=True)
        (1): ROIAlign(output_size=(14, 14), spatial_scale=0.125, sampling_ratio=0, aligned=True)
        (2): ROIAlign(output_size=(14, 14), spatial_scale=0.0625, sampling_ratio=0, aligned=True)
        (3): ROIAlign(output_size=(14, 14), spatial_scale=0.03125, sampling_ratio=0, aligned=True)
      )
    )
    (mask_head): MaskRCNNConvUpsampleHead(
      (mask_fcn1): Conv2d(
        256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1)
        (activation): ReLU()
      )
      (mask_fcn2): Conv2d(
        256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1)
        (activation): ReLU()
      )
      (mask_fcn3): Conv2d(
        256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1)
        (activation): ReLU()
      )
      (mask_fcn4): Conv2d(
        256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1)
        (activation): ReLU()
      )
      (deconv): ConvTranspose2d(256, 256, kernel_size=(2, 2), stride=(2, 2))
      (deconv_relu): ReLU()
      (predictor): Conv2d(256, 1, kernel_size=(1, 1), stride=(1, 1))
    )
  )
)
[04/30 06:54:03 d2.data.build]: Removed 0 images with no usable annotations. 61 images left.
[04/30 06:54:03 d2.data.build]: Distribution of instances among all 1 categories:
|  category  | #instances   |
|:----------:|:-------------|
|  balloon   | 255          |
|            |              |
[04/30 06:54:03 d2.data.dataset_mapper]: [DatasetMapper] Augmentations used in training: [ResizeShortestEdge(short_edge_length=(640, 672, 704, 736, 768, 800), max_size=1333, sample_style='choice'), RandomFlip()]
[04/30 06:54:03 d2.data.build]: Using training sampler TrainingSampler
[04/30 06:54:03 d2.data.common]: Serializing the dataset using: <class 'detectron2.data.common._TorchSerializedList'>
[04/30 06:54:03 d2.data.common]: Serializing 61 elements to byte tensors and concatenating them all ...
[04/30 06:54:03 d2.data.common]: Serialized dataset takes 0.17 MiB
[04/30 06:54:03 d2.checkpoint.detection_checkpoint]: [DetectionCheckpointer] Loading from https://dl.fbaipublicfiles.com/detectron2/COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x/137849600/model_final_f10217.pkl ...
model_final_f10217.pkl: 178MB [00:02, 75.3MB/s]                          
WARNING:fvcore.common.checkpoint:Skip loading parameter 'roi_heads.box_predictor.cls_score.weight' to the model due to incompatible shapes: (81, 1024) in the checkpoint but (2, 1024) in the model! You might want to double check if this is expected.
WARNING:fvcore.common.checkpoint:Skip loading parameter 'roi_heads.box_predictor.cls_score.bias' to the model due to incompatible shapes: (81,) in the checkpoint but (2,) in the model! You might want to double check if this is expected.
WARNING:fvcore.common.checkpoint:Skip loading parameter 'roi_heads.box_predictor.bbox_pred.weight' to the model due to incompatible shapes: (320, 1024) in the checkpoint but (4, 1024) in the model! You might want to double check if this is expected.
WARNING:fvcore.common.checkpoint:Skip loading parameter 'roi_heads.box_predictor.bbox_pred.bias' to the model due to incompatible shapes: (320,) in the checkpoint but (4,) in the model! You might want to double check if this is expected.
WARNING:fvcore.common.checkpoint:Skip loading parameter 'roi_heads.mask_head.predictor.weight' to the model due to incompatible shapes: (80, 256, 1, 1) in the checkpoint but (1, 256, 1, 1) in the model! You might want to double check if this is expected.
WARNING:fvcore.common.checkpoint:Skip loading parameter 'roi_heads.mask_head.predictor.bias' to the model due to incompatible shapes: (80,) in the checkpoint but (1,) in the model! You might want to double check if this is expected.
WARNING:fvcore.common.checkpoint:Some model parameters or buffers are not found in the checkpoint:
roi_heads.box_predictor.bbox_pred.{bias, weight}
roi_heads.box_predictor.cls_score.{bias, weight}
roi_heads.mask_head.predictor.{bias, weight}
[04/30 06:54:05 d2.engine.train_loop]: Starting training from iteration 0
/usr/local/lib/python3.10/dist-packages/torch/functional.py:504: UserWarning: torch.meshgrid: in an upcoming release, it will be required to pass the indexing argument. (Triggered internally at ../aten/src/ATen/native/TensorShape.cpp:3483.)
  return _VF.meshgrid(tensors, **kwargs)  # type: ignore[attr-defined]
[04/30 06:54:18 d2.utils.events]:  eta: 0:02:03  iter: 19  total_loss: 2.146  loss_cls: 0.7301  loss_box_reg: 0.7054  loss_mask: 0.6942  loss_rpn_cls: 0.03187  loss_rpn_loc: 0.00954    time: 0.4561  last_time: 0.4256  data_time: 0.0337  last_data_time: 0.0340   lr: 1.6068e-05  max_mem: 2455M
[04/30 06:54:30 d2.utils.events]:  eta: 0:01:55  iter: 39  total_loss: 1.767  loss_cls: 0.6171  loss_box_reg: 0.5889  loss_mask: 0.6012  loss_rpn_cls: 0.02424  loss_rpn_loc: 0.004564    time: 0.4497  last_time: 0.5170  data_time: 0.0092  last_data_time: 0.0233   lr: 3.2718e-05  max_mem: 2455M
[04/30 06:54:39 d2.utils.events]:  eta: 0:01:49  iter: 59  total_loss: 1.835  loss_cls: 0.5053  loss_box_reg: 0.7199  loss_mask: 0.4906  loss_rpn_cls: 0.04432  loss_rpn_loc: 0.01324    time: 0.4561  last_time: 0.4568  data_time: 0.0130  last_data_time: 0.0266   lr: 4.9367e-05  max_mem: 2456M
[04/30 06:54:51 d2.utils.events]:  eta: 0:01:43  iter: 79  total_loss: 1.449  loss_cls: 0.3847  loss_box_reg: 0.6427  loss_mask: 0.365  loss_rpn_cls: 0.02072  loss_rpn_loc: 0.007872    time: 0.4891  last_time: 0.4607  data_time: 0.0469  last_data_time: 0.0160   lr: 6.6017e-05  max_mem: 2574M
[04/30 06:55:00 d2.utils.events]:  eta: 0:01:32  iter: 99  total_loss: 1.237  loss_cls: 0.3137  loss_box_reg: 0.5958  loss_mask: 0.2853  loss_rpn_cls: 0.03939  loss_rpn_loc: 0.008389    time: 0.4855  last_time: 0.4037  data_time: 0.0119  last_data_time: 0.0043   lr: 8.2668e-05  max_mem: 2574M
[04/30 06:55:10 d2.utils.events]:  eta: 0:01:23  iter: 119  total_loss: 1.149  loss_cls: 0.2453  loss_box_reg: 0.6494  loss_mask: 0.2161  loss_rpn_cls: 0.01339  loss_rpn_loc: 0.005409    time: 0.4822  last_time: 0.4801  data_time: 0.0148  last_data_time: 0.0132   lr: 9.9318e-05  max_mem: 2612M
[04/30 06:55:20 d2.utils.events]:  eta: 0:01:14  iter: 139  total_loss: 1.032  loss_cls: 0.2096  loss_box_reg: 0.6288  loss_mask: 0.2124  loss_rpn_cls: 0.01758  loss_rpn_loc: 0.008476    time: 0.4860  last_time: 0.3769  data_time: 0.0170  last_data_time: 0.0038   lr: 0.00011597  max_mem: 2612M
[04/30 06:55:30 d2.utils.events]:  eta: 0:01:05  iter: 159  total_loss: 0.7927  loss_cls: 0.1471  loss_box_reg: 0.5167  loss_mask: 0.1499  loss_rpn_cls: 0.01585  loss_rpn_loc: 0.00623    time: 0.4867  last_time: 0.3468  data_time: 0.0214  last_data_time: 0.0058   lr: 0.00013262  max_mem: 2612M
[04/30 06:55:38 d2.utils.events]:  eta: 0:00:55  iter: 179  total_loss: 0.7589  loss_cls: 0.1219  loss_box_reg: 0.4812  loss_mask: 0.1269  loss_rpn_cls: 0.01871  loss_rpn_loc: 0.006172    time: 0.4814  last_time: 0.4045  data_time: 0.0121  last_data_time: 0.0099   lr: 0.00014927  max_mem: 2612M
[04/30 06:55:49 d2.utils.events]:  eta: 0:00:46  iter: 199  total_loss: 0.5453  loss_cls: 0.09887  loss_box_reg: 0.3309  loss_mask: 0.1008  loss_rpn_cls: 0.01691  loss_rpn_loc: 0.007972    time: 0.4842  last_time: 0.5318  data_time: 0.0193  last_data_time: 0.0350   lr: 0.00016592  max_mem: 2612M
[04/30 06:55:59 d2.utils.events]:  eta: 0:00:37  iter: 219  total_loss: 0.4851  loss_cls: 0.1042  loss_box_reg: 0.2436  loss_mask: 0.1031  loss_rpn_cls: 0.01303  loss_rpn_loc: 0.009417    time: 0.4850  last_time: 0.5056  data_time: 0.0196  last_data_time: 0.0212   lr: 0.00018257  max_mem: 2612M
[04/30 06:56:09 d2.utils.events]:  eta: 0:00:28  iter: 239  total_loss: 0.34  loss_cls: 0.06758  loss_box_reg: 0.1757  loss_mask: 0.07039  loss_rpn_cls: 0.01167  loss_rpn_loc: 0.006367    time: 0.4899  last_time: 0.7319  data_time: 0.0421  last_data_time: 0.1122   lr: 0.00019922  max_mem: 2612M
[04/30 06:56:19 d2.utils.events]:  eta: 0:00:18  iter: 259  total_loss: 0.4182  loss_cls: 0.08279  loss_box_reg: 0.1719  loss_mask: 0.1069  loss_rpn_cls: 0.0144  loss_rpn_loc: 0.01126    time: 0.4894  last_time: 0.5146  data_time: 0.0205  last_data_time: 0.0237   lr: 0.00021587  max_mem: 2612M
[04/30 06:56:29 d2.utils.events]:  eta: 0:00:09  iter: 279  total_loss: 0.3486  loss_cls: 0.06866  loss_box_reg: 0.1945  loss_mask: 0.07559  loss_rpn_cls: 0.01046  loss_rpn_loc: 0.007445    time: 0.4885  last_time: 0.3725  data_time: 0.0156  last_data_time: 0.0048   lr: 0.00023252  max_mem: 2612M
[04/30 06:56:40 d2.utils.events]:  eta: 0:00:00  iter: 299  total_loss: 0.261  loss_cls: 0.06409  loss_box_reg: 0.1413  loss_mask: 0.06687  loss_rpn_cls: 0.006927  loss_rpn_loc: 0.003735    time: 0.4887  last_time: 0.6228  data_time: 0.0236  last_data_time: 0.0724   lr: 0.00024917  max_mem: 2612M
[04/30 06:56:40 d2.engine.hooks]: Overall training speed: 298 iterations in 0:02:25 (0.4887 s / it)
[04/30 06:56:40 d2.engine.hooks]: Total training time: 0:02:30 (0:00:04 on hooks)
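For context, 300 iterations at 2 images per batch amounts to roughly ten passes over the training set (a back-of-the-envelope check, using the 61-image count reported in the log above):

```python
ims_per_batch = 2    # cfg.SOLVER.IMS_PER_BATCH
max_iter = 300       # cfg.SOLVER.MAX_ITER
train_images = 61    # reported by d2.data.build in the log

images_seen = ims_per_batch * max_iter
epochs = images_seen / train_images
print(f"{images_seen} images seen, about {epochs:.1f} epochs")
```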

E.4.3 Inference & evaluation using the trained model

Now, let’s run inference with the trained model on the balloon validation dataset. First, let’s create a predictor using the model we just trained:

# Inference should use the config with parameters that are used in training
# cfg now already contains everything we've set previously. We changed it a little bit for inference:
cfg.MODEL.WEIGHTS = os.path.join(cfg.OUTPUT_DIR, "model_final.pth")  # path to the model we just trained
cfg.MODEL.ROI_HEADS.SCORE_THRESH_TEST = 0.7   # set a custom testing threshold
predictor = DefaultPredictor(cfg)
[04/30 06:57:36 d2.checkpoint.detection_checkpoint]: [DetectionCheckpointer] Loading from ./output/model_final.pth ...

Then, we randomly select several samples to visualize the prediction results.

dataset_dicts = get_balloon_dicts("balloon/val")
for d in random.sample(dataset_dicts, 3):    
    im = cv2.imread(d["file_name"])
    outputs = predictor(im)  # format is documented at https://detectron2.readthedocs.io/tutorials/models.html#model-output-format
    v = Visualizer(im[:, :, ::-1],
                   metadata=balloon_metadata, 
                   scale=0.5, 
                   instance_mode=ColorMode.IMAGE_BW   # remove the colors of unsegmented pixels. This option is only available for segmentation models
    )
    out = v.draw_instance_predictions(outputs["instances"].to("cpu"))
    cv2_imshow(out.get_image()[:, :, ::-1])
Output hidden; open in https://colab.research.google.com to view.

We can also evaluate its performance using the AP metric implemented in the COCO API. This gives a bbox AP of ~75 and a mask AP of ~78. Not bad!

evaluator = COCOEvaluator("balloon_val", output_dir="./output")
val_loader = build_detection_test_loader(cfg, "balloon_val")
print(inference_on_dataset(predictor.model, val_loader, evaluator))
# another equivalent way to evaluate the model is to use `trainer.test`
[04/30 06:57:52 d2.evaluation.coco_evaluation]: Fast COCO eval is not built. Falling back to official COCO eval.
[04/30 06:57:52 d2.evaluation.coco_evaluation]: Trying to convert 'balloon_val' to COCO format ...
[04/30 06:57:52 d2.data.datasets.coco]: Converting annotations of dataset 'balloon_val' to COCO format ...)
[04/30 06:57:53 d2.data.datasets.coco]: Converting dataset dicts into COCO format
[04/30 06:57:53 d2.data.datasets.coco]: Conversion finished, #images: 13, #annotations: 50
[04/30 06:57:53 d2.data.datasets.coco]: Caching COCO format annotations at './output/balloon_val_coco_format.json' ...
[04/30 06:57:53 d2.data.build]: Distribution of instances among all 1 categories:
|  category  | #instances   |
|:----------:|:-------------|
|  balloon   | 50           |
|            |              |
[04/30 06:57:53 d2.data.dataset_mapper]: [DatasetMapper] Augmentations used in inference: [ResizeShortestEdge(short_edge_length=(800, 800), max_size=1333, sample_style='choice')]
[04/30 06:57:53 d2.data.common]: Serializing the dataset using: <class 'detectron2.data.common._TorchSerializedList'>
[04/30 06:57:53 d2.data.common]: Serializing 13 elements to byte tensors and concatenating them all ...
[04/30 06:57:53 d2.data.common]: Serialized dataset takes 0.04 MiB
[04/30 06:57:53 d2.evaluation.evaluator]: Start inference on 13 batches
[04/30 06:57:56 d2.evaluation.evaluator]: Inference done 11/13. Dataloading: 0.0014 s/iter. Inference: 0.1043 s/iter. Eval: 0.0053 s/iter. Total: 0.1110 s/iter. ETA=0:00:00
[04/30 06:57:56 d2.evaluation.evaluator]: Total inference time: 0:00:00.923665 (0.115458 s / iter per device, on 1 devices)
[04/30 06:57:56 d2.evaluation.evaluator]: Total inference pure compute time: 0:00:00 (0.103552 s / iter per device, on 1 devices)
[04/30 06:57:56 d2.evaluation.coco_evaluation]: Preparing results for COCO format ...
[04/30 06:57:56 d2.evaluation.coco_evaluation]: Saving results to ./output/coco_instances_results.json
[04/30 06:57:56 d2.evaluation.coco_evaluation]: Evaluating predictions with official COCO API...
Loading and preparing results...
DONE (t=0.00s)
creating index...
index created!
Running per image evaluation...
Evaluate annotation type *bbox*
DONE (t=0.01s).
Accumulating evaluation results...
DONE (t=0.01s).
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.746
 Average Precision  (AP) @[ IoU=0.50      | area=   all | maxDets=100 ] = 0.859
 Average Precision  (AP) @[ IoU=0.75      | area=   all | maxDets=100 ] = 0.838
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.269
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.551
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.901
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=  1 ] = 0.238
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets= 10 ] = 0.764
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.764
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.267
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.571
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.923
[04/30 06:57:56 d2.evaluation.coco_evaluation]: Evaluation results for bbox: 
|   AP   |  AP50  |  AP75  |  APs   |  APm   |  APl   |
|:------:|:------:|:------:|:------:|:------:|:------:|
| 74.610 | 85.871 | 83.829 | 26.931 | 55.069 | 90.110 |
Loading and preparing results...
DONE (t=0.00s)
creating index...
index created!
Running per image evaluation...
Evaluate annotation type *segm*
DONE (t=0.02s).
Accumulating evaluation results...
DONE (t=0.01s).
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.779
 Average Precision  (AP) @[ IoU=0.50      | area=   all | maxDets=100 ] = 0.838
 Average Precision  (AP) @[ IoU=0.75      | area=   all | maxDets=100 ] = 0.838
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.236
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.539
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.964
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=  1 ] = 0.254
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets= 10 ] = 0.788
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.788
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.233
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.559
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.973
[04/30 06:57:56 d2.evaluation.coco_evaluation]: Evaluation results for segm: 
|   AP   |  AP50  |  AP75  |  APs   |  APm   |  APl   |
|:------:|:------:|:------:|:------:|:------:|:------:|
| 77.919 | 83.829 | 83.829 | 23.564 | 53.851 | 96.356 |
OrderedDict([('bbox', {'AP': 74.60969255997529, 'AP50': 85.87131716204989, 'AP75': 83.82940778549906, 'APs': 26.930693069306926, 'APm': 55.068926123381566, 'APl': 90.10970121205669}), ('segm', {'AP': 77.91911061287945, 'AP50': 83.82940778549906, 'AP75': 83.82940778549906, 'APs': 23.564356435643557, 'APm': 53.85148514851485, 'APl': 96.3558721256741})])

E.4.4 Other types of built-in models

We showcase simple demos of other types of models below:

# Inference with a keypoint detection model
cfg = get_cfg()   # get a fresh new config
cfg.merge_from_file(model_zoo.get_config_file("COCO-Keypoints/keypoint_rcnn_R_50_FPN_3x.yaml"))
cfg.MODEL.ROI_HEADS.SCORE_THRESH_TEST = 0.7  # set threshold for this model
cfg.MODEL.WEIGHTS = model_zoo.get_checkpoint_url("COCO-Keypoints/keypoint_rcnn_R_50_FPN_3x.yaml")
predictor = DefaultPredictor(cfg)
outputs = predictor(im)
v = Visualizer(im[:,:,::-1], MetadataCatalog.get(cfg.DATASETS.TRAIN[0]), scale=1.2)
out = v.draw_instance_predictions(outputs["instances"].to("cpu"))
cv2_imshow(out.get_image()[:, :, ::-1])
Output hidden; open in https://colab.research.google.com to view.
# Inference with a panoptic segmentation model
cfg = get_cfg()
cfg.merge_from_file(model_zoo.get_config_file("COCO-PanopticSegmentation/panoptic_fpn_R_101_3x.yaml"))
cfg.MODEL.WEIGHTS = model_zoo.get_checkpoint_url("COCO-PanopticSegmentation/panoptic_fpn_R_101_3x.yaml")
predictor = DefaultPredictor(cfg)
panoptic_seg, segments_info = predictor(im)["panoptic_seg"]
v = Visualizer(im[:, :, ::-1], MetadataCatalog.get(cfg.DATASETS.TRAIN[0]), scale=1.2)
out = v.draw_panoptic_seg_predictions(panoptic_seg.to("cpu"), segments_info)
cv2_imshow(out.get_image()[:, :, ::-1])
[04/30 06:58:22 d2.checkpoint.detection_checkpoint]: [DetectionCheckpointer] Loading from https://dl.fbaipublicfiles.com/detectron2/COCO-PanopticSegmentation/panoptic_fpn_R_101_3x/139514519/model_final_cafdb1.pkl ...
model_final_cafdb1.pkl: 261MB [00:02, 97.8MB/s]                           

E.5 Image segmentation via fastai

Creating a model that can recognize the content of every individual pixel in an image is called segmentation. Here is how we can train a segmentation model with fastai, using a subset of the CamVid dataset from the paper “Semantic Object Classes in Video: A High-Definition Ground Truth Database” by Gabriel J. Brostow, Julien Fauqueur, and Roberto Cipolla:

path = untar_data(URLs.CAMVID_TINY)
dls = SegmentationDataLoaders.from_label_func(
    path, bs=8, fnames = get_image_files(path/"images"),
    label_func = lambda o: path/'labels'/f'{o.stem}_P{o.suffix}',
    codes = np.loadtxt(path/'codes.txt', dtype=str)
)

learn = unet_learner(dls, resnet34)
learn.fine_tune(8)
100.18% [2318336/2314212 00:00<00:00]
/usr/local/lib/python3.10/dist-packages/torchvision/models/_utils.py:208: UserWarning: The parameter 'pretrained' is deprecated since 0.13 and may be removed in the future, please use 'weights' instead.
  warnings.warn(
/usr/local/lib/python3.10/dist-packages/torchvision/models/_utils.py:223: UserWarning: Arguments other than a weight enum or `None` for 'weights' are deprecated since 0.13 and may be removed in the future. The current behavior is equivalent to passing `weights=ResNet34_Weights.IMAGENET1K_V1`. You can also use `weights=ResNet34_Weights.DEFAULT` to get the most up-to-date weights.
  warnings.warn(msg)
Downloading: "https://download.pytorch.org/models/resnet34-b627a593.pth" to /root/.cache/torch/hub/checkpoints/resnet34-b627a593.pth
100%|██████████| 83.3M/83.3M [00:00<00:00, 130MB/s]
epoch train_loss valid_loss time
0 3.086789 2.394335 00:02
epoch train_loss valid_loss time
0 1.912205 1.644341 00:01
1 1.613313 1.217818 00:01
2 1.505060 1.243426 00:01
3 1.352298 0.977219 00:01
4 1.208431 0.888610 00:01
5 1.088054 0.788914 00:01
6 0.990660 0.758133 00:01
7 0.914747 0.755208 00:01

We can visualize how well the model achieved its task by asking it to color-code each pixel of an image. As you can see, it classifies nearly every pixel of every object correctly. For instance, notice that all of the cars are overlaid with the same color, as are all of the trees (in each pair of images, the left-hand image is the ground-truth label and the right-hand image is the model’s prediction):

learn.show_results(max_n=4, figsize=(7,8))

E.6 Data cleaning with CleanVision

CleanVision is built to automatically detect various issues in image datasets. This data-centric AI package is designed as a quick first step for any computer vision project: it finds problems in your dataset that you may want to address before applying machine learning. The Issue Key column in the following table specifies the name used for each type of issue in CleanVision code.

|   | Issue Type       | Description                                                                                      | Issue Key        |
|--:|:-----------------|:-------------------------------------------------------------------------------------------------|:-----------------|
| 1 | Light            | Images that are too bright/washed out in the dataset                                             | light            |
| 2 | Dark             | Images that are irregularly dark                                                                 | dark             |
| 3 | Odd Aspect Ratio | Images with an unusual aspect ratio (i.e. overly skinny/wide)                                    | odd_aspect_ratio |
| 4 | Exact Duplicates | Images that are exact duplicates of each other                                                   | exact_duplicates |
| 5 | Near Duplicates  | Images that are almost visually identical to each other (e.g. same image with different filters) | near_duplicates  |
| 6 | Blurry           | Images that are blurry or out of focus                                                           | blurry           |
| 7 | Grayscale        | Images that are grayscale (lacking color)                                                        | grayscale        |
| 8 | Low Information  | Images that lack much information (e.g. a completely black image with a few white dots)          | low_information  |
!wget -nc 'https://cleanlab-public.s3.amazonaws.com/CleanVision/image_files.zip'
!unzip -q image_files.zip
--2023-04-30 06:34:14--  https://cleanlab-public.s3.amazonaws.com/CleanVision/image_files.zip
Resolving cleanlab-public.s3.amazonaws.com (cleanlab-public.s3.amazonaws.com)... 52.216.108.19, 52.217.203.193, 52.217.33.132, ...
Connecting to cleanlab-public.s3.amazonaws.com (cleanlab-public.s3.amazonaws.com)|52.216.108.19|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 78293407 (75M) [application/zip]
Saving to: ‘image_files.zip’

image_files.zip     100%[===================>]  74.67M  67.9MB/s    in 1.1s    

2023-04-30 06:34:16 (67.9 MB/s) - ‘image_files.zip’ saved [78293407/78293407]

FINISHED --2023-04-30 06:34:16--
Total wall clock time: 1.4s
Downloaded: 1 files, 75M in 1.1s (67.9 MB/s)
# Path to your dataset; you can specify your own dataset path
dataset_path = "./image_files/"

# Initialize imagelab with your dataset
imagelab = Imagelab(data_path=dataset_path)

# Visualize a few sample images from the dataset
imagelab.visualize(num_images=8)
Reading images from /content/image_files
Sample images from the dataset

# Find issues
# You can also specify issue types to detect, for example
# issue_types = {"dark": {}}
# imagelab.find_issues(issue_types)
imagelab.find_issues()
Checking for dark, light, odd_aspect_ratio, low_information, exact_duplicates, near_duplicates, blurry, grayscale images ...
100%|██████████| 595/595 [00:01<00:00, 417.82it/s]
100%|██████████| 595/595 [00:00<00:00, 728.59it/s] 
Issue checks completed. To see a detailed report of issues found, use imagelab.report().

The report() method helps you quickly understand the major issues detected in the dataset. It reports the number of images in the dataset that exhibit each type of issue, and shows example images corresponding to the most severe instances of each issue.

imagelab.report()
Issues found in order of severity in the dataset

|    | issue_type       |   num_images |
|---:|:-----------------|-------------:|
|  0 | grayscale        |           20 |
|  1 | near_duplicates  |           20 |
|  2 | exact_duplicates |           19 |
|  3 | dark             |           13 |
|  4 | blurry           |           10 |
|  5 | odd_aspect_ratio |            8 |
|  6 | light            |            5 |
|  7 | low_information  |            4 | 


Top 4 examples with grayscale issue in the dataset.


Top 4 sets of images with near_duplicates issue
Set: 0

Set: 1

Set: 2

Set: 3


Top 4 sets of images with exact_duplicates issue
Set: 0

Set: 1

Set: 2

Set: 3


Top 4 examples with dark issue in the dataset.


Top 4 examples with blurry issue in the dataset.


Top 4 examples with odd_aspect_ratio issue in the dataset.


Top 4 examples with light issue in the dataset.


Top 4 examples with low_information issue in the dataset.

The main way to interface with your data is via the Imagelab class. This class can be used to understand the issues in your dataset at a high level (a global overview) and at a low level (issues and quality scores for each image), and to access additional information about the dataset. It has three main attributes:

  • Imagelab.issue_summary
  • Imagelab.issues
  • Imagelab.info

E.6.1 imagelab.issue_summary

A DataFrame with a global summary of all issue types detected in your dataset and the overall prevalence of each type.

In each row:

  • issue_type - name of the issue
  • num_images - number of images of that issue type found in the dataset

imagelab.issue_summary
|    | issue_type       |   num_images |
|---:|:-----------------|-------------:|
|  0 | grayscale        |           20 |
|  1 | near_duplicates  |           20 |
|  2 | exact_duplicates |           19 |
|  3 | dark             |           13 |
|  4 | blurry           |           10 |
|  5 | odd_aspect_ratio |            8 |
|  6 | light            |            5 |
|  7 | low_information  |            4 |

E.6.2 imagelab.issues

A DataFrame assessing each image in your dataset, reporting which issues each image exhibits and a quality score for each type of issue.

imagelab.issues.head()
|                                    | odd_aspect_ratio_score | is_odd_aspect_ratio_issue | low_information_score | is_low_information_issue | light_score | is_light_issue | grayscale_score | is_grayscale_issue | dark_score | is_dark_issue | blurry_score | is_blurry_issue | is_exact_duplicates_issue | is_near_duplicates_issue |
|:-----------------------------------|-----------------------:|:--------------------------|----------------------:|:-------------------------|------------:|:---------------|----------------:|:-------------------|-----------:|:--------------|-------------:|:----------------|:--------------------------|:-------------------------|
| /content/image_files/image_0.png   | 1.0 | False | 0.806332 | False | 0.925490 | False | 1 | False | 1.000000 | False | 0.373038 | False | False | False |
| /content/image_files/image_1.png   | 1.0 | False | 0.923116 | False | 0.906609 | False | 1 | False | 0.990676 | False | 0.345064 | False | False | False |
| /content/image_files/image_10.png  | 1.0 | False | 0.875129 | False | 0.995127 | False | 1 | False | 0.795937 | False | 0.534317 | False | False | False |
| /content/image_files/image_100.png | 1.0 | False | 0.916140 | False | 0.889762 | False | 1 | False | 0.827587 | False | 0.494283 | False | False | False |
| /content/image_files/image_101.png | 1.0 | False | 0.779338 | False | 0.960784 | False | 0 | True  | 0.992157 | False | 0.471333 | False | False | False |

There is a Boolean column for each issue type indicating whether each image exhibits that issue. For example, rows where the is_dark_issue column is True correspond to images that appear too dark. For the dark issue type (and likewise for the other issue types), there is a numeric column, dark_score, which assesses how severe the issue is in each image. These quality scores lie between 0 and 1, where lower values indicate more severe instances of the issue (darker images, in this example).

One use case for imagelab.issues is to filter out all images exhibiting a particular type of issue and rank them by quality score. Here is how to get all blurry images ranked by their blurry_score (remember, lower scores indicate higher severity):

blurry_images = imagelab.issues[imagelab.issues["is_blurry_issue"] == True].sort_values(by=['blurry_score'])
blurry_image_files = blurry_images.index.tolist()
imagelab.visualize(image_files=blurry_image_files[:4])
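Because imagelab.issues is an ordinary pandas DataFrame, its Boolean issue columns can also be combined to build more specific filters. A minimal sketch using a toy DataFrame standing in for imagelab.issues (the column names follow the real output above; the file names and values here are made up):

```python
import pandas as pd

# Toy stand-in for imagelab.issues with the same column-naming scheme
issues = pd.DataFrame({
    "is_dark_issue":   [True,  False, True,  False],
    "dark_score":      [0.10,  0.95,  0.05,  0.80],
    "is_blurry_issue": [True,  True,  False, False],
    "blurry_score":    [0.20,  0.15,  0.70,  0.90],
}, index=["img_0.png", "img_1.png", "img_2.png", "img_3.png"])

# Images flagged as both dark and blurry, worst dark_score first
dark_and_blurry = (
    issues[issues["is_dark_issue"] & issues["is_blurry_issue"]]
    .sort_values("dark_score")
)
print(dark_and_blurry.index.tolist())  # only img_0.png matches both filters
```

The resulting index list can be passed to imagelab.visualize(image_files=...) exactly as in the blurry example above.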

imagelab.visualize() can also be used to see examples of specific issues in your dataset. num_images and cell_size are optional arguments that control the number of examples shown per issue type and the size of each image in the grid, respectively.

issue_types = ["grayscale"]
imagelab.visualize(issue_types=issue_types, num_images=8, cell_size=(3, 3))

Top 8 examples with grayscale issue in the dataset.

E.6.3 imagelab.info

This is a nested dictionary containing statistics about the images and other miscellaneous information gathered while checking for issues in the dataset. Possible keys in this dict are statistics and a key corresponding to each issue type.

imagelab.info.keys()
dict_keys(['statistics', 'dark', 'light', 'odd_aspect_ratio', 'low_information', 'blurry', 'grayscale', 'exact_duplicates', 'near_duplicates'])

imagelab.info['statistics'] is itself a dict containing the image statistics that were computed while checking for issues in the dataset.

imagelab.info['statistics'].keys()
dict_keys(['brightness', 'aspect_ratio', 'entropy', 'blurriness', 'color_space'])

imagelab.info can also be used to retrieve which images are near or exact duplicates of each other. issue_summary shows the number of exact-duplicate images, but not how many sets of duplicate images exist in the dataset. To see the number of exact-duplicate sets, use imagelab.info:

imagelab.info['exact_duplicates']['num_sets']
9

You can also see exactly which images belong to each (exact or near) duplicate set using imagelab.info.

imagelab.info['exact_duplicates']['sets']
[['/content/image_files/image_142.png', '/content/image_files/image_236.png'],
 ['/content/image_files/image_170.png', '/content/image_files/image_299.png'],
 ['/content/image_files/image_190.png', '/content/image_files/image_197.png'],
 ['/content/image_files/image_288.png', '/content/image_files/image_289.png'],
 ['/content/image_files/image_292.png',
  '/content/image_files/image_348.png',
  '/content/image_files/image_492.png'],
 ['/content/image_files/image_30.png', '/content/image_files/image_55.png'],
 ['/content/image_files/image_351.png', '/content/image_files/image_372.png'],
 ['/content/image_files/image_379.png', '/content/image_files/image_579.png'],
 ['/content/image_files/image_550.png', '/content/image_files/image_7.png']]

E.6.4 Check for an issue with a different threshold

You can use the loaded imagelab instance to check for an issue type with a custom hyperparameter. The table below lists the hyperparameters each issue type supports and their permissible values:

  • threshold - all images with scores below this threshold will be flagged as having the issue.

  • hash_size - controls how much detail of an image is kept when computing its perceptual hash; higher values keep more detail.

  • hash_type - the type of perceptual hash to use. Currently whash and phash are the supported hash types; check the imagehash documentation for more details on these hash types.

|   | Issue Key        | Hyperparameters                                   |
|--:|:-----------------|:--------------------------------------------------|
| 1 | light            | threshold (between 0 and 1)                       |
| 2 | dark             | threshold (between 0 and 1)                       |
| 3 | odd_aspect_ratio | threshold (between 0 and 1)                       |
| 4 | exact_duplicates | N/A                                               |
| 5 | near_duplicates  | hash_size (power of 2), hash_types (whash, phash) |
| 6 | blurry           | threshold (between 0 and 1)                       |
| 7 | grayscale        | threshold (between 0 and 1)                       |
| 8 | low_information  | threshold (between 0 and 1)                       |
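To build intuition for how hash_size trades detail for robustness in the near-duplicate check, here is a toy average hash implemented with NumPy. This is only an illustration of the idea, not CleanVision's implementation (which uses the whash/phash perceptual hashes): an image is downsampled to a hash_size × hash_size grid and thresholded, so near-duplicates end up with fingerprints that differ in few bits.

```python
import numpy as np

def average_hash(img, hash_size=8):
    """Downsample a grayscale image to hash_size x hash_size block means,
    then threshold at the overall mean. Returns a boolean fingerprint of
    hash_size**2 bits; larger hash_size keeps more spatial detail."""
    h, w = img.shape
    # Crop so the image divides evenly into hash_size x hash_size blocks
    img = img[: h - h % hash_size, : w - w % hash_size]
    small = img.reshape(hash_size, h // hash_size,
                        hash_size, w // hash_size).mean(axis=(1, 3))
    return (small > small.mean()).ravel()

rng = np.random.default_rng(0)
img = rng.random((64, 64))
# A near-duplicate: the same image with a tiny amount of noise
noisy = np.clip(img + rng.normal(scale=0.001, size=img.shape), 0, 1)

# Near-duplicates differ in only a few hash bits (small Hamming distance)
print(int((average_hash(img) != average_hash(noisy)).sum()))
```

With a larger hash_size the fingerprint encodes finer detail, so smaller edits are enough to change bits; with a smaller hash_size more heavily edited images still collide.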
issue_types = {"dark": {"threshold": 0.2}}
imagelab.find_issues(issue_types)

imagelab.report(issue_types)
Checking for dark images ...
Issue checks completed. To see a detailed report of issues found, use imagelab.report().
Issues found in order of severity in the dataset

|    | issue_type   |   num_images |
|---:|:-------------|-------------:|
|  5 | dark         |            8 | 


Top 4 examples with dark issue in the dataset.

Note that the number of images with the dark issue has decreased from the previous run!

E.6.5 Save and load

CleanVision also provides save and load functionality, so you can save results and load them at a later point to inspect them or run more checks. When saving, specify force=True to overwrite existing files:

save_path = "./results"
imagelab.save(save_path)
Saved Imagelab to folder: ./results
The data path and dataset must be not be changed to maintain consistent state when loading this Imagelab
## For loading a saved instance, specify `dataset_path` 
## to help check for any inconsistencies between dataset paths in the previous and current run.
imagelab = Imagelab.load(save_path, dataset_path)
Successfully loaded Imagelab

E.7 Label issues with cleanlab

mnist = fetch_openml("mnist_784")  # Fetch the MNIST dataset

X = mnist.data.astype("float32").to_numpy() # 2D array (images are flattened into 1D)
X /= 255.0  # Scale the features to the [0, 1] range

X = X.reshape(len(X), 1, 28, 28)  # reshape into [N, C, H, W] for PyTorch
labels = mnist.target.astype("int64").to_numpy()  # 1D array of given labels
/usr/local/lib/python3.10/dist-packages/sklearn/datasets/_openml.py:968: FutureWarning: The default value of `parser` will change from `'liac-arff'` to `'auto'` in 1.4. You can set `parser='auto'` to silence this warning. Therefore, an `ImportError` will be raised from 1.4 if the dataset is dense and pandas is not installed. Note that the pandas parser may return different data types. See the Notes Section in fetch_openml's API doc for details.
  warn(

E.7.1 Ensure your classifier is scikit-learn compatible

Here, we define a simple neural network with PyTorch:

# We use subclassing API here
class ClassifierModule(nn.Module):
    def __init__(self):
        super().__init__()

        self.cnn = nn.Sequential(
            nn.Conv2d(1, 6, 3),
            nn.ReLU(),
            nn.BatchNorm2d(6),
            nn.MaxPool2d(kernel_size=2, stride=2),
            nn.Conv2d(6, 16, 3),
            nn.ReLU(),
            nn.BatchNorm2d(16),
            nn.MaxPool2d(kernel_size=2, stride=2),
        )
        self.out = nn.Sequential(
            nn.Flatten(),
            nn.LazyLinear(128), # A torch.nn.Linear module where in_features is inferred!
            nn.ReLU(),
            nn.Linear(128, 10),
            nn.Softmax(dim=-1),
        )

    def forward(self, X):
        X = self.cnn(X)
        X = self.out(X)
        return X

As some cleanlab features require scikit-learn compatibility, we adapt the above PyTorch neural net accordingly. skorch is a convenient package that helps with this:

clf = NeuralNetClassifier(ClassifierModule)
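What “scikit-learn compatible” means in practice is that the wrapped object exposes fit(X, y), predict(X), and predict_proba(X). A minimal hand-rolled illustration of that contract (a toy classifier, not skorch's implementation) is:

```python
import numpy as np
from sklearn.base import BaseEstimator, ClassifierMixin

class MajorityClassifier(BaseEstimator, ClassifierMixin):
    """Toy scikit-learn-compatible classifier: every input gets the
    training-set class frequencies as its predicted probabilities."""

    def fit(self, X, y):
        self.classes_, counts = np.unique(y, return_counts=True)
        self.priors_ = counts / counts.sum()
        return self  # sklearn convention: fit returns self

    def predict_proba(self, X):
        # Same class-frequency distribution for every input row
        return np.tile(self.priors_, (len(X), 1))

    def predict(self, X):
        return self.classes_[self.predict_proba(X).argmax(axis=1)]

X = np.zeros((6, 4))
y = np.array([0, 0, 0, 1, 1, 2])
clf_toy = MajorityClassifier().fit(X, y)
print(clf_toy.predict_proba(X[:2]))  # each row is [0.5, 1/3, 1/6]
```

Because skorch's NeuralNetClassifier follows this same interface, it can be passed directly to sklearn utilities such as cross_val_predict.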

E.7.2 Compute out-of-sample predicted probabilities

If we’d like cleanlab to identify potential label errors in the whole dataset and not just the training set, we can consider using the entire dataset when computing the out-of-sample predicted probabilities, pred_probs, via cross-validation.
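For intuition, cross_val_predict trains one model per fold, each on the remaining folds, and fills in predictions for the held-out fold, so every row of pred_probs comes from a model that never saw that example during training. A small self-contained sklearn example (logistic regression on synthetic data, not the CNN above):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict

# Synthetic 3-class problem standing in for MNIST
X_toy, y_toy = make_classification(
    n_samples=90, n_classes=3, n_informative=5, random_state=0
)
pred_probs_toy = cross_val_predict(
    LogisticRegression(max_iter=1000), X_toy, y_toy,
    cv=3, method="predict_proba",
)
print(pred_probs_toy.shape)  # (90, 3): one row per example, one column per class
assert np.allclose(pred_probs_toy.sum(axis=1), 1.0)  # each row is a distribution
```

The real run below is identical in shape: pred_probs ends up as an (N, 10) array of out-of-sample probabilities for the 70,000 MNIST digits.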

num_crossval_folds = 3  # for efficiency; values like 5 or 10 will generally work better
pred_probs = cross_val_predict(
    clf,
    X,
    labels,
    cv=num_crossval_folds,
    method="predict_proba",
)
/usr/local/lib/python3.10/dist-packages/torch/nn/modules/lazy.py:180: UserWarning: Lazy modules are a new feature under heavy development so changes to the API or functionality can happen at any moment.
  warnings.warn('Lazy modules are a new feature under heavy development '
  epoch    train_loss    valid_acc    valid_loss     dur
-------  ------------  -----------  ------------  ------
      1        0.7562       0.9165        0.3194  2.2443
      2        0.2114       0.9421        0.1991  2.4727
      3        0.1476       0.9547        0.1553  2.1834
      4        0.1185       0.9610        0.1323  2.1328
      5        0.1008       0.9643        0.1175  2.1259
      6        0.0889       0.9660        0.1079  2.1473
      7        0.0800       0.9694        0.1005  2.2020
      8        0.0733       0.9712        0.0949  2.7131
      9        0.0678       0.9724        0.0899  2.7966
     10        0.0633       0.9731        0.0862  2.4333
/usr/local/lib/python3.10/dist-packages/torch/nn/modules/lazy.py:180: UserWarning: Lazy modules are a new feature under heavy development so changes to the API or functionality can happen at any moment.
  warnings.warn('Lazy modules are a new feature under heavy development '
  epoch    train_loss    valid_acc    valid_loss     dur
-------  ------------  -----------  ------------  ------
      1        0.7694       0.9164        0.3139  2.1668
      2        0.2229       0.9447        0.2009  2.3349
      3        0.1564       0.9556        0.1608  2.3366
      4        0.1261       0.9598        0.1386  2.1405
      5        0.1074       0.9631        0.1240  2.1643
      6        0.0944       0.9654        0.1130  2.1670
      7        0.0846       0.9678        0.1041  2.1175
      8        0.0767       0.9702        0.0975  2.5494
      9        0.0704       0.9726        0.0919  2.1594
     10        0.0651       0.9736        0.0875  2.1436
/usr/local/lib/python3.10/dist-packages/torch/nn/modules/lazy.py:180: UserWarning: Lazy modules are a new feature under heavy development so changes to the API or functionality can happen at any moment.
  warnings.warn('Lazy modules are a new feature under heavy development '
  epoch    train_loss    valid_acc    valid_loss     dur
-------  ------------  -----------  ------------  ------
      1        0.7947       0.9176        0.3226  2.3926
      2        0.2277       0.9468        0.1976  2.4256
      3        0.1590       0.9559        0.1524  2.4540
      4        0.1276       0.9618        0.1291  2.1569
      5        0.1087       0.9651        0.1149  2.1608
      6        0.0959       0.9674        0.1052  2.1491
      7        0.0863       0.9687        0.0981  2.1503
      8        0.0789       0.9714        0.0922  2.2473
      9        0.0727       0.9728        0.0879  2.3513
     10        0.0676       0.9738        0.0839  2.1506

An additional benefit of cross-validation is that it facilitates more reliable evaluation of our model than a single training/validation split.

predicted_labels = pred_probs.argmax(axis=1)
acc = accuracy_score(labels, predicted_labels)
print(f"Cross-validated estimate of accuracy on held-out data: {acc}")
Cross-validated estimate of accuracy on held-out data: 0.9765857142857143

E.7.3 Use cleanlab to find label issues

Based on the given labels and out-of-sample predicted probabilities, cleanlab can quickly help us identify label issues in our dataset. For a dataset with N examples from K classes, the labels should be a 1D array of length N and predicted probabilities should be a 2D (N x K) array. Here we request that the indices of the identified label issues be sorted by cleanlab’s self-confidence score, which measures the quality of each given label via the probability assigned to it in our model’s prediction.
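The self-confidence score of an example is simply the predicted probability assigned to its given label, pred_probs[i, labels[i]]; low values mean the model disagrees with the label. A minimal NumPy sketch of this ranking with made-up numbers (cleanlab's find_label_issues additionally decides which examples count as issues at all, rather than ranking everything):

```python
import numpy as np

labels_toy = np.array([0, 1, 1, 2])
pred_probs_toy = np.array([
    [0.9, 0.05, 0.05],  # model agrees with the given label 0
    [0.2, 0.7,  0.1 ],
    [0.8, 0.1,  0.1 ],  # given label 1, but the model favors class 0
    [0.3, 0.3,  0.4 ],
])

# Probability the model assigns to each example's *given* label
self_confidence = pred_probs_toy[np.arange(len(labels_toy)), labels_toy]
ranked = np.argsort(self_confidence)  # most suspicious labels first
print(ranked.tolist())  # example 2 comes first (self-confidence 0.1)
```

Example 2 tops the ranking because its given label received only 0.1 probability, which is exactly the kind of example find_label_issues surfaces first.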

ranked_label_issues = find_label_issues(
    labels,
    pred_probs,
    return_indices_ranked_by="self_confidence",
)

print(f"Cleanlab found {len(ranked_label_issues)} label issues.")
print(f"Top 15 most likely label errors: \n {ranked_label_issues[:15]}")
Cleanlab found 136 label issues.
Top 15 most likely label errors: 
 [59915 24798 28556  8200 26882  6448 63520  7010  1604   902 51248 23824
 20672 22643 53216]

ranked_label_issues is an array of indices corresponding to examples that are worth inspecting more closely.

Let’s look at the top 15 examples cleanlab thinks are most likely to be incorrectly labeled. We can see a few label errors and odd edge cases. Feel free to change the values below to display more/fewer examples.

plot_examples(ranked_label_issues[range(15)], 3, 5)

Let’s zoom into some specific examples from the above set:

Given label is 4 but looks more like a 7:

plot_examples([59915])

Given label is 4 but looks more like a 9:

plot_examples([24798])

A very odd looking 6:

plot_examples([63520])

cleanlab has shortlisted the most likely label errors to speed up your data cleaning process. With this list, you can decide whether to fix label issues or prune some of these examples from the dataset.
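If you decide to prune rather than relabel, the flagged indices can be dropped directly. A sketch with NumPy, using toy arrays shaped like X and labels above (issue_indices stands in for ranked_label_issues):

```python
import numpy as np

X_toy = np.arange(10).reshape(5, 2)  # 5 examples, 2 features each
labels_toy = np.array([0, 1, 0, 1, 0])
issue_indices = np.array([1, 3])     # stand-in for ranked_label_issues

# Drop the flagged examples from both the features and the labels
X_clean = np.delete(X_toy, issue_indices, axis=0)
labels_clean = np.delete(labels_toy, issue_indices)
print(X_clean.shape, labels_clean.tolist())  # (3, 2) [0, 0, 0]
```

After pruning (or fixing) the flagged labels, you would retrain the model on the cleaned dataset.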

E.8 References

  1. https://github.com/ageron/handson-ml3/

  2. https://github.com/fchollet/deep-learning-with-python-notebooks

  3. https://github.com/fastai/fastbook2e

  4. https://github.com/facebookresearch/detectron2

  5. https://github.com/cleanlab/cleanlab

  6. https://github.com/cleanlab/cleanvision