# Q5: Self-Supervised Learning for Image Classification

Contents: compute_train_transform / CIFAR10Pair.\_\_getitem\_\_() · simclr_loss_naive · sim_positive_pairs · compute_sim_matrix · simclr_loss_vectorized · train · closing remarks. If you find the walkthrough long-winded, skip straight to the code blocks.

## compute_train_transform / CIFAR10Pair.\_\_getitem\_\_()

### Problem

*(Notebook prompt; screenshot omitted.)*

### Analysis

This part asks us to apply a sequence of augmentations to each image. The exact steps are all spelled out in the prompt; we only need to look up the corresponding APIs in `torchvision.transforms` and call them.

### Code

```python
def compute_train_transform(seed=123456):
    """
    This function returns a composition of data augmentations to a single training image.
    Complete the following lines. Hint: look at available functions in torchvision.transforms
    """
    random.seed(seed)
    torch.random.manual_seed(seed)

    # Transformation that applies color jitter with brightness=0.4, contrast=0.4,
    # saturation=0.4, and hue=0.1.
    color_jitter = transforms.ColorJitter(0.4, 0.4, 0.4, 0.1)

    train_transform = transforms.Compose([
        ############################################################################
        # TODO: Start of your code.                                                #
        #                                                                          #
        # Hint: Check out transformation functions in torchvision.transforms.     #
        # The first operation is filled out for you as an example.                 #
        ############################################################################
        # Step 1: Randomly resize and crop to 32x32.
        transforms.RandomResizedCrop(32),
        # Step 2: Horizontally flip the image with probability 0.5.
        transforms.RandomHorizontalFlip(p=0.5),
        # Step 3: With a probability of 0.8, apply color jitter
        # (you can use color_jitter defined above).
        transforms.RandomApply(torch.nn.ModuleList([color_jitter]), p=0.8),
        # Step 4: With a probability of 0.2, convert the image to grayscale.
        transforms.RandomGrayscale(p=0.2),
        ############################################################################
        #                            END OF YOUR CODE                              #
        ############################################################################
        transforms.ToTensor(),
        transforms.Normalize([0.4914, 0.4822, 0.4465], [0.2023, 0.1994, 0.2010]),
    ])
    return train_transform


def compute_test_transform():
    test_transform = transforms.Compose([
        transforms.ToTensor(),
        transforms.Normalize([0.4914, 0.4822, 0.4465], [0.2023, 0.1994, 0.2010]),
    ])
    return test_transform


class CIFAR10Pair(CIFAR10):
    """CIFAR10 Dataset returning two augmented views of each image."""

    def __getitem__(self, index):
        img, target = self.data[index], self.targets[index]
        img = Image.fromarray(img)

        x_i = None
        x_j = None

        if self.transform is not None:
            ########################################################################
            # TODO: Start of your code.                                            #
            #                                                                      #
            # Apply self.transform to the image to produce x_i and x_j as in the  #
            # paper.                                                               #
            ########################################################################
            x_i = self.transform(img)
            x_j = self.transform(img)
            ########################################################################
            #                          END OF YOUR CODE                            #
            ########################################################################

        if self.target_transform is not None:
            target = self.target_transform(target)

        return x_i, x_j, target
```
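As a quick way to see that the pair dataset really returns two independently augmented views, here is a minimal sketch, not part of the notebook; the `root` path and batch size are illustrative assumptions, and the functions above are assumed to be in scope:

```python
from torch.utils.data import DataLoader

# Hypothetical quick check; root and batch_size are illustrative.
train_data = CIFAR10Pair(root='data', train=True,
                         transform=compute_train_transform(), download=True)
train_loader = DataLoader(train_data, batch_size=4, shuffle=True)

# Each batch holds two augmented views of the same images, plus the labels.
x_i, x_j, y = next(iter(train_loader))
print(x_i.shape, x_j.shape, y.shape)
# Expect: torch.Size([4, 3, 32, 32]) torch.Size([4, 3, 32, 32]) torch.Size([4])
```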
### Output

One odd note: I don't know why, but running this locally produced results with a small numerical error, while the exact same code passed on Colab. Strange; that's all I can say.

## simclr_loss_naive

### Problem

Compute the SimCLR loss the straightforward, loop-based way.

### Analysis

Just follow the formula in the problem statement step by step (restated below).
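For reference, this is the normalized-temperature cross-entropy loss from the SimCLR paper that the code below implements, where the $z$ vectors are projection-head outputs, $\tau$ is the temperature, and $\mathrm{sim}(u, v) = u \cdot v / (\lVert u \rVert \lVert v \rVert)$:

$$
\ell(i, j) = -\log \frac{\exp\bigl(\mathrm{sim}(z_i, z_j)/\tau\bigr)}
{\sum_{k=1,\, k \neq i}^{2N} \exp\bigl(\mathrm{sim}(z_i, z_k)/\tau\bigr)},
\qquad
L = \frac{1}{2N} \sum_{k=1}^{N} \bigl[\ell(k,\, k{+}N) + \ell(k{+}N,\, k)\bigr]
$$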
### Code

```python
def sim(z_i, z_j):
    """Normalized dot product between two vectors.

    Inputs:
    - z_i: 1xD tensor.
    - z_j: 1xD tensor.

    Returns:
    - A scalar value that is the normalized dot product between z_i and z_j.
    """
    norm_dot_product = None
    ##########################################################################
    # TODO: Start of your code.                                              #
    #                                                                        #
    # HINT: torch.linalg.norm might be helpful.                              #
    ##########################################################################
    # torch.linalg.norm applied to a vector returns a scalar: its norm.
    norm_dot_product = torch.dot(z_i, z_j) / (torch.linalg.norm(z_i) * torch.linalg.norm(z_j))
    ##########################################################################
    #                             END OF YOUR CODE                           #
    ##########################################################################
    return norm_dot_product


def simclr_loss_naive(out_left, out_right, tau):
    """Compute the contrastive loss L over a batch (naive loop version).

    Input:
    - out_left: NxD tensor; output of the projection head g(), left branch of the SimCLR model.
    - out_right: NxD tensor; output of the projection head g(), right branch of the SimCLR model.
      Each row is a z-vector for an augmented sample in the batch. The same row in out_left
      and out_right forms a positive pair: (out_left[k], out_right[k]) for all k = 0...N-1.
    - tau: scalar value, temperature parameter that determines how fast the exponential increases.

    Returns:
    - A scalar value; the total loss across all positive pairs in the batch.
    """
    N = out_left.shape[0]  # total number of training examples

    # Concatenate out_left and out_right into a 2*N x D tensor.
    out = torch.cat([out_left, out_right], dim=0)  # [2*N, D]

    total_loss = 0
    for k in range(N):  # loop through each positive pair (k, k+N)
        z_k, z_k_N = out[k], out[k + N]
        ######################################################################
        # TODO: Start of your code.                                          #
        #                                                                    #
        # Hint: Compute l(k, k+N) and l(k+N, k).                             #
        ######################################################################
        # Compute l(k, k+N).
        # Numerator of the left term.
        left_numerator = (sim(z_k, z_k_N) / tau).exp()
        # Every z-vector except z_k goes into the left denominator.
        left_need_sim = out[np.arange(2 * N) != k]
        left_denominator = torch.tensor([sim(z_k, z_i) / tau for z_i in left_need_sim]).exp().sum()
        left = -(left_numerator / left_denominator).log()

        # Compute l(k+N, k).
        right_numerator = (sim(z_k_N, z_k) / tau).exp()
        # Every z-vector except z_{k+N} goes into the right denominator.
        right_need_sim = out[np.arange(2 * N) != k + N]
        right_denominator = torch.tensor([sim(z_k_N, z_i) / tau for z_i in right_need_sim]).exp().sum()
        right = -(right_numerator / right_denominator).log()

        total_loss += left + right
        ######################################################################
        #                           END OF YOUR CODE                         #
        ######################################################################

    # In the end, divide the total loss by 2*N, the number of samples in the batch.
    total_loss = total_loss / (2 * N)
    return total_loss
```

### Output

*(Screenshot omitted.)*

## sim_positive_pairs

### Problem

The difference from before is that previously we only computed `sim` between two vectors; now we compute it row by row between two matrices.

### Analysis

See the comments in the code.

### Code

```python
def sim_positive_pairs(out_left, out_right):
    """Normalized dot product between positive pairs.

    Inputs:
    - out_left: NxD tensor; output of the projection head g(), left branch of the SimCLR model.
    - out_right: NxD tensor; output of the projection head g(), right branch of the SimCLR model.
      Each row is a z-vector for an augmented sample in the batch.
      The same row in out_left and out_right forms a positive pair.

    Returns:
    - A Nx1 tensor; each row k is the normalized dot product between out_left[k] and out_right[k].
    """
    pos_pairs = None
    ##########################################################################
    # TODO: Start of your code.                                              #
    #                                                                        #
    # HINT: torch.linalg.norm might be helpful.                              #
    ##########################################################################
    # Looking at the formula: sim is a single division, but it can be seen
    # as two normalizations followed by a dot product.
    norm_left = out_left / torch.linalg.norm(out_left, dim=1, keepdim=True)
    norm_right = out_right / torch.linalg.norm(out_right, dim=1, keepdim=True)
    pos_pairs = torch.sum(norm_left * norm_right, dim=1, keepdim=True)
    ##########################################################################
    #                             END OF YOUR CODE                           #
    ##########################################################################
    return pos_pairs
```

### Output

*(Screenshot omitted.)*

## compute_sim_matrix

### Problem

This task asks us to produce `sim_matrix`, a 2N x 2N matrix where `matrix[i, j]` holds sim(i, j). So after normalizing the rows, we simply multiply the result by its own transpose.

### Analysis

See above, or read the code to see how it works.

### Code

```python
def compute_sim_matrix(out):
    """Compute a 2N x 2N matrix of normalized dot products between all pairs of
    augmented examples in a batch.

    Inputs:
    - out: 2N x D tensor; each row is the z-vector (output of the projection head)
      of a single augmented example. There are a total of 2N augmented examples.

    Returns:
    - sim_matrix: 2N x 2N tensor; element (i, j) is the normalized dot product
      between out[i] and out[j].
    """
    sim_matrix = None
    ##########################################################################
    # TODO: Start of your code.                                              #
    ##########################################################################
    out_norm = out / torch.linalg.norm(out, dim=1, keepdim=True)
    sim_matrix = out_norm @ out_norm.T
    ##########################################################################
    #                             END OF YOUR CODE                           #
    ##########################################################################
    return sim_matrix
```

### Output

*(Screenshot omitted.)*
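As an extra sanity check tying the two helpers together (not part of the assignment; the shapes and random inputs are illustrative), entry `[k, k+N]` of the full similarity matrix should equal the k-th positive-pair similarity computed directly:

```python
import torch

# Illustrative consistency check between compute_sim_matrix and sim_positive_pairs.
N, D = 4, 8
out_left, out_right = torch.randn(N, D), torch.randn(N, D)
out = torch.cat([out_left, out_right], dim=0)

sim_matrix = compute_sim_matrix(out)                 # [2N, 2N]
pos_pairs = sim_positive_pairs(out_left, out_right)  # [N, 1]

idx = torch.arange(N)
print(torch.allclose(sim_matrix[idx, idx + N], pos_pairs.squeeze(1), atol=1e-6))
# Expect: True
```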
## simclr_loss_vectorized

### Problem

Implement the vectorized loss computation; just follow the numbered steps.

### Analysis

See the comments.

### Code

```python
def simclr_loss_vectorized(out_left, out_right, tau, device='cuda'):
    """Compute the contrastive loss L over a batch (vectorized version).
    No loops are allowed.

    Inputs and output are the same as in simclr_loss_naive.
    """
    N = out_left.shape[0]

    # Concatenate out_left and out_right into a 2*N x D tensor.
    out = torch.cat([out_left, out_right], dim=0)  # [2*N, D]

    # Compute the similarity matrix between all pairs of augmented examples in the batch.
    sim_matrix = compute_sim_matrix(out)  # [2*N, 2*N]

    ##########################################################################
    # TODO: Start of your code. Follow the hints.                            #
    ##########################################################################
    # Step 1: Use sim_matrix to compute the denominator value for all augmented samples.
    # Hint: Compute e^{sim / tau} and store it in exponential, which should be 2N x 2N.
    exponential = (sim_matrix / tau).exp().to(device)

    # This binary mask zeros out terms where k = i.
    mask = (torch.ones_like(exponential, device=device) - torch.eye(2 * N, device=device)).to(device).bool()

    # Apply the binary mask.
    exponential = exponential.masked_select(mask).view(2 * N, -1)  # [2*N, 2*N-1]

    # The denominator values for all augmented samples; per the hint this should be
    # a 2N x 1 vector, so keep the reduced dimension (keepdim=True) so the division
    # by the 2N x 1 numerator below broadcasts elementwise rather than to 2N x 2N.
    denom = exponential.sum(dim=1, keepdim=True)

    # Step 2: Compute similarity between positive pairs.
    # Option 1: Extract the corresponding indices from sim_matrix.
    # Option 2: Use sim_positive_pairs().
    # Compute the similarity of every positive pair...
    sim_pairs = sim_positive_pairs(out_left, out_right).to(device)
    # ...then stack it twice, since the positive pairs are symmetric.
    sim_pairs = torch.cat([sim_pairs, sim_pairs], dim=0)

    # Step 3: Compute the numerator value for all augmented samples.
    numerator = (sim_pairs / tau).exp()

    # Step 4: With the numerator and denominator for all samples, compute the total loss.
    loss = torch.mean(-torch.log(numerator / denom))
    ##########################################################################
    #                             END OF YOUR CODE                           #
    ##########################################################################
    return loss
```

### Output

*(Screenshot omitted.)*
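The notebook checks the vectorized version against the naive one; to reproduce a similar check locally, here is a small sketch, with random inputs and sizes that are purely illustrative:

```python
import torch

# Illustrative check that the vectorized loss matches the naive loop version.
torch.manual_seed(0)
N, D, tau = 8, 16, 0.5
out_left, out_right = torch.randn(N, D), torch.randn(N, D)

naive = simclr_loss_naive(out_left, out_right, tau)
vectorized = simclr_loss_vectorized(out_left, out_right, tau, device='cpu')
print(naive.item(), vectorized.item())  # the two values should agree closely
```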
## train

### Problem

*(Notebook prompt; screenshot omitted.)*

### Analysis

Here we just push x_i and x_j through the model to get the processed projections g_i and g_j, then compute the loss between them.

One warning: training is expected to eat about 5 GB of GPU memory, so check whether your card can handle it. Mine is a 1060 with 3 GB, so training this model spilled over into system memory; it still worked, but it slowed training down.

### Code

The completed cell appeared only as a screenshot in the original post.
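Since the screenshot is lost, here is a minimal sketch of the training step it describes, not the notebook's exact cell; it assumes `model(x)` returns a `(feature, projection)` pair as in the assignment's model class, and that `simclr_loss_vectorized` is the function implemented above:

```python
# A minimal sketch of one SimCLR training epoch, under the assumptions above.
def train(model, data_loader, train_optimizer, temperature=0.5, device='cuda'):
    model.train()
    total_loss, total_num = 0.0, 0
    for x_i, x_j, target in data_loader:
        x_i, x_j = x_i.to(device), x_j.to(device)

        # Produce the projections g(x_i) and g(x_j) for both augmented views.
        _, out_left = model(x_i)
        _, out_right = model(x_j)

        # Contrastive loss over the batch of positive pairs.
        loss = simclr_loss_vectorized(out_left, out_right, temperature, device=device)

        train_optimizer.zero_grad()
        loss.backward()
        train_optimizer.step()

        total_num += x_i.size(0)
        total_loss += loss.item() * x_i.size(0)
    return total_loss / total_num
```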

### Output

Training took roughly an hour.

## Closing remarks

And with that, the assignment is done. Although I implemented SimCLR myself, I still don't feel entirely clear about the model as a whole; the details, on the other hand, are now clear. Once I've finished all the CS231n assignments, I'll come back and work through it again.
